What are the Latest Jobs in AI?

Now that generative AI offerings such as ChatGPT have firmly entrenched themselves in tech workflows, new jobs are available to tech professionals with the right knowledge base. The companies that build and research AI are looking for experts in various AI and machine learning (ML) subfields. Even companies that don’t specialize in AI are creating in-house AI tools for proprietary functions (such as searching in-house databases).

Let’s take a look at what some of these new jobs are, what they entail, and what kind of background you need. And maybe one will jump out as your next career choice.

Ethics Specialists and AI Legal Experts

Ethics has always played a big part in computing (cookie tracking comes to mind), but AI introduces a giant load of new concerns. Here are just a few of the questions that companies building AI need to, well, concern themselves with:

  • When people using AI provide personal details inside a prompt, is the AI allowed to save those details and add them to its knowledge base?
  • Furthermore, does it change the situation if such personal details are “anonymized”?
  • AI might not be able to distinguish truth from lies. What if an AI spreads lies about a political candidate? Can that candidate sue if they lose the election?
  • Somebody allegedly asked an AI how to make a dangerous substance used in chemical warfare. The AI refused. The person then allegedly asked the same AI what chemicals to avoid mixing together, lest they create said substance, and the AI explained in detail what not to do. Can the AI be held responsible for indirectly providing instructions for creating a dangerous substance?
  • Can AI give advice? Although the example hasn’t been verified, a story presently going around the web claims that Google’s AI surfaced a Reddit comment suggesting a rather unsavory and cruel “cure” for depression. Should an AI do such a thing? And if a person followed through with such bad advice, could the AI company be sued?
  • When an AI that produces images is “learning” and explores the works of contemporary artists (i.e., works that are still under copyright), can it use what it learned from those images to build images that look completely different but have the same general “look and feel” as the originals? Should the AI get permission? Should the AI pay the original artist?

Several of these questions open serious legal issues as well. Courts will hand down rulings, lawyers will need to learn the resulting precedents, and soon there will likely be lawyers who specialize in AI. Downstream, that means tech professionals who focus on AI will also need to learn as much as possible about ethics, and perhaps even specialize in “ethical AI programming.”

LLM Engineer

Large Language Models (LLMs) are the key to today’s AI. In 2017, a research paper titled “Attention Is All You Need” introduced a new concept called the transformer model, which was the breakthrough that opened up today’s AI as we know it. With that came the concept of a Large Language Model, which doesn’t store words so much as the connections between words, learned from the contexts in which they appear in sentences and paragraphs. LLMs are complicated to build and understand, but with the right background you can learn how they work and become an LLM engineer.

By LLM engineer, we don’t mean someone who writes code that uses LLMs. We mean someone who writes the code that builds and manipulates the LLMs themselves. These people work for companies such as OpenAI, which built ChatGPT, but other companies and plenty of AI startups also need LLM engineers to create libraries of code that manipulate LLMs. Where do you begin?

First, you need to be strong in mathematics. When you think back to learning trig in high school, you’ll recall studying how to determine sines and cosines of angles that lie flat on the page, which is two dimensions. LLMs use the same trigonometry, in the form of a measure called cosine similarity, to determine how similar human words are to each other; however, the calculations operate over hundreds of dimensions, not just two. Further, your math knowledge needs to extend into advanced calculus, including partial derivatives, and you need a strong knowledge of probability.
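
To make the idea concrete, here’s a minimal sketch of that cosine similarity calculation, using tiny made-up three-dimensional vectors (real embeddings run to hundreds or thousands of dimensions):

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    # Cosine of the angle between two vectors: 1.0 means they point
    # the same way, 0.0 means they're perpendicular (unrelated).
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Made-up 3-D "embeddings"; real models use hundreds of dimensions.
king  = np.array([0.80, 0.65, 0.10])
queen = np.array([0.75, 0.70, 0.15])
apple = np.array([0.10, 0.20, 0.90])

print(cosine_similarity(king, queen))  # close to 1: related words
print(cosine_similarity(king, apple))  # much lower: unrelated words
```

Words that show up in similar contexts end up with vectors pointing in similar directions, so their cosine similarity lands near 1. The same formula works whether the vectors have three dimensions or three thousand.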

The required math is indeed complex, but the good news is that if you studied math, computer science, or physics in college, you probably learned much of it already; you don’t necessarily need a master’s or PhD in mathematics. Additionally, you’ll need to be strong in programming languages such as Python, C++, and Java.

ML Engineer

Machine learning has been around for quite some time, long before the advent of the transformer model mentioned earlier. Amazon, for example, has done lots of research into machine learning so that it can make product suggestions based on your purchases and interactions with its website.

ML engineers need to know how to gather and analyze enormous amounts of data; how to “clean” that data (meaning spotting and removing errors in it); how to build large models (such as LLMs, though they could be other types of models) and train them appropriately; and how to extract new information from those models. In the case of Amazon, this means extracting new information from its massive product catalog to generate suggestions.
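
As a rough illustration of that pipeline, here’s a minimal sketch using pandas and scikit-learn. The file name, column names, and prediction target are all hypothetical; a real pipeline involves vastly more data and far more cleaning:

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Hypothetical purchase data; the file and columns are made up for illustration.
df = pd.read_csv("purchases.csv")

# "Cleaning": drop rows with missing values and obviously bad entries.
df = df.dropna()
df = df[df["price"] > 0]

features = df[["price", "category_id", "times_viewed"]]
target = df["purchased_again"]  # 1 if the customer later bought a similar item

X_train, X_test, y_train, y_test = train_test_split(
    features, target, test_size=0.2, random_state=42
)

# Train the model, then check how well it predicts on data it hasn't seen.
model = RandomForestClassifier(n_estimators=100, random_state=42)
model.fit(X_train, y_train)
print("Accuracy:", model.score(X_test, y_test))
```

The steps are the same at any scale: gather, clean, train, evaluate, extract predictions. What changes in industry is the volume of data and the machinery needed to process it.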

These models are likely huge and require cloud-based platforms such as AWS or Google Cloud. (In fact, Amazon Web Services was originally built internally by Amazon to support its own huge data endeavors; the company eventually realized others could benefit from using it.)

ML engineering isn’t limited to product suggestions. One example is healthcare, where large models can be used to predict the likelihood of a patient developing a certain disease, and to develop protocols that help prevent the disease through specific lifestyle changes and medications. Another example is supply chains, where a machine learning system can predict demand for certain products and, in turn, order the products or the raw materials to build them.

How do you become an ML engineer? A typical path is to start by learning data analytics and data science. From there, plan to take courses in mathematics similar to those we described for the LLM engineer role, and plan to learn programming in Python, Java, and C++.

AI Product Manager

AI companies like OpenAI operate quite differently from traditional software companies. Their primary objective is to research and develop new ideas in AI; building software is often a secondary role.

This adds a unique wrinkle to software development, because researchers work very differently from software developers. For example, the timelines are more flexible and less predictable. Researchers do a lot of experimenting, sometimes without a clear goal or endpoint in sight. They may need to take many steps back, and they might have multiple false starts. This makes managing products that rely on original research significantly more complex. Also, remember that in a lot of cases, the software released by research companies is a byproduct of the research, such as tools built to test and validate it.

ChatGPT, for example, has grown into a premium product for OpenAI, but originally it was a proof of concept for the research. In other words, often the research is the actual product for such companies, and the tools they produce are secondary to their research. This means managing the creation of such software requires much more patience and the realization that the tools may take much longer to build, and might need to be scrapped altogether as the research gets modified.

To become an AI product manager, you’ll need all the skills necessary for traditional product management, plus an understanding of how research organizations work. A basic understanding of how AI, including LLMs, works is also beneficial.

Software Developer

While this is certainly not a new field, today’s software developers are being asked more and more to integrate AI into their applications. You’re seeing this happen already in apps you use; for example, Google is rolling out AI assistance on its search page. To make this happen, developers need to know how to work with AI software libraries.

In this particular case, you don’t need to become an expert on, say, building LLMs. Instead, you’ll want to learn as much as you can about LLMs and machine learning, and then learn how to use libraries that abstract away the hard work of interacting with such AI systems. That way, you can focus your time on building the app you’re assigned and integrating AI into it, without having to become an LLM engineer.

For example, if you code in Python, you’ll want to check out ChromaDB, a vector database that stores text and lets you search it by meaning. Many of the popular languages now have similar frameworks. In these cases, you don’t need to know how LLMs work internally, but rather how to use them and put them to work.
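
Here’s a minimal sketch of that idea using ChromaDB’s Python API (based on its documentation at the time of writing; the documents are made up for illustration). Notice that the library embeds the text for you, so you never touch the model’s internals:

```python
import chromadb

# In-memory client; ChromaDB also offers a persistent, on-disk client.
client = chromadb.Client()
collection = client.create_collection(name="support_docs")

# ChromaDB embeds these documents automatically with its default
# embedding model (downloaded on first use).
collection.add(
    documents=[
        "To reset your password, click 'Forgot password' on the login page.",
        "Standard shipping takes three to five business days.",
    ],
    ids=["doc1", "doc2"],
)

# Semantic search: the query shares no keywords with doc1, yet it matches,
# because the embeddings capture meaning rather than exact wording.
results = collection.query(query_texts=["I can't remember my login"], n_results=1)
print(results["documents"][0])
```

A handful of calls like these is often all the AI plumbing an application developer needs; the heavy lifting happens inside the library.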

AI Quality Assurance Tester

How in the world do you test an AI tool? Chatbots like ChatGPT are especially hard to test. In normal software development, when you write a function, you can expect that the function will give you consistent results when it’s fed consistent data. That’s not true with AI: when you feed a question such as “What is the nature of the universe?” into a generative AI tool, you’re not going to be able to verify that it provides an expected and consistent response.
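
One common approach is to test properties of the response rather than its exact wording. Here’s a minimal pytest-style sketch; ask_chatbot is a hypothetical wrapper around whatever chatbot API you happen to be testing:

```python
# ask_chatbot() is a hypothetical placeholder; swap in the real API call
# for the chatbot under test.
def ask_chatbot(prompt: str) -> str:
    raise NotImplementedError("call your chatbot API here")

def test_response_properties():
    response = ask_chatbot("What is the nature of the universe?")
    # You can't assert an exact answer, but you can assert properties
    # that every acceptable answer should have:
    assert isinstance(response, str)
    assert response.strip()          # it said something
    assert len(response) < 10_000    # and stayed within length limits
```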

Traditional QA testing skills still matter, though, even when testing chatbots. For example, LLMs rely on mathematical calculations, such as the cosine similarity described earlier, to determine whether certain words in certain contexts are similar. The results of those calculations are consistent and predictable, and that’s where you can apply your existing QA skills in the context of AI software.
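
For instance, a similarity function like the one shown in the LLM Engineer section always returns the same output for the same input, so ordinary unit tests apply. A minimal sketch, using only the standard library:

```python
import math

def cosine_similarity(a, b):
    # Same calculation as in the earlier sketch, standard library only.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def test_identical_vectors_are_maximally_similar():
    v = [0.3, 0.5, 0.2]
    assert math.isclose(cosine_similarity(v, v), 1.0)

def test_perpendicular_vectors_have_zero_similarity():
    assert math.isclose(cosine_similarity([1.0, 0.0], [0.0, 1.0]), 0.0,
                        abs_tol=1e-12)
```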

That means learning everything you can about software testing, while also understanding the bigger picture of how AI tools, and components such as LLMs, work.

Conclusion

AI has created a new industry that will be growing for many years to come. The careers we’ve mentioned here are just a small sampling. Many of the new careers are updates of older careers such as software developer. As new AI companies form, they’ll need customer support, salespeople, managers, financial people, chief technical officers, CEOs, you name it—and all of these positions will require at least some level of understanding of how AI works.