AI

How AI and GenAI intersect and differ

Generative artificial intelligence (GenAI) is having a moment, but despite all the attention it’s grabbing, it comprises a small portion of the AI and machine learning that’s being done today.

At its foundation, AI is automation, allowing systems to do things without human intervention – the intelligence is expressed in the complexity of the tasks that can be performed automatically and in the ability to make decisions.

“The term ‘AI’ should not be interpreted as this black and white event,” Andy Hilliard, CEO of Accelerance, told Fierce Electronics. The emergence of AI is not a single, disruptive event either – it’s been an evolutionary path, he said, much like the industrial revolution in that machines slowly took over human activities. Over time, this machine learning has been dubbed AI.

Hilliard said software developers have been creating products that have become increasingly smarter. The Excel spreadsheet is a great example of something that was paper-based and now has intelligence at the back end with an interface that makes it more dynamic and easier for everyone to use, he said.

GenAI speeds up software development because it can help create better scripts, check for errors, and recommend coding structures, Hilliard said. This is an example of how software development is being democratized, and AI is following the same path. He said the speed of evolution feels revolutionary because it’s happening so fast.

There are still limits to what AI can do because of the complex interrelationships among the many variables at play all at once, which is why driving has yet to be fully automated, Hilliard said.

What’s real and what isn’t

Many types of AI have been envisioned, but most remain strictly theoretical – IBM, known for watsonx.ai, has laid out the broad categories and sub-categories of AI.

“Theory of Mind AI” sounds like something out of a science fiction novel – it would be able to understand the thoughts and emotions of other entities, allowing it to mimic human-like interactions. Similarly, “Self-Aware AI” is also theoretical and would have its own set of emotions, needs and beliefs.

These types of AI fall under the broader buckets of either General AI or Super AI, which are both theoretical. Any AI that exists today, whether emerging or broadly adopted, is known as Artificial Narrow AI – it can’t perform outside of its defined task, which is why it’s narrow.

GenAI falls into a category known as “Limited Memory AI,” which can predict words, phrases, or visual elements within the content it’s generating – ChatGPT is a good example. Virtual assistants and chatbots such as Alexa and Siri also fall into this category, combining Limited Memory AI with natural language processing (NLP) to understand and answer questions and act on user requests. Self-driving cars can be autonomous because they use Limited Memory AI to understand their environment and make decisions in real time, although autonomous vehicles have taken longer than expected to become commonplace.
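
To make the prediction idea concrete, here is a toy sketch in Python: a bigram model that simply memorizes which word most often follows another in a tiny made-up corpus. Production systems like ChatGPT use large neural networks over tokens rather than lookup tables, but the underlying task – predict the next element from context – is the same.

```python
# Toy next-word predictor: counts which word follows which in a corpus.
# Illustration only; real LLMs learn these statistics with neural nets.
from collections import Counter, defaultdict

corpus = "the cat sat on the mat and the cat slept".split()

followers = defaultdict(Counter)
for current_word, next_word in zip(corpus, corpus[1:]):
    followers[current_word][next_word] += 1

def predict_next(word):
    """Return the most frequent follower of `word`, or None if unseen."""
    counts = followers.get(word)
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("the"))  # -> "cat" ("cat" follows "the" twice, "mat" once)
```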

Reactive Machines AI is less intelligent than Limited Memory AI – systems in this category have no memory and are designed to perform an extremely specific task based on the data available in the moment, such as the recommendation engines used by e-commerce sites and streaming services.
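
A minimal sketch of what “no memory, decide from the moment” can look like in code: a stateless recommender that ranks items purely by similarity to whatever is being viewed right now. The item names and feature vectors below are invented for illustration; real engines use far richer signals.

```python
# Stateless, "reactive" recommender: no user history is stored anywhere.
import numpy as np

ITEM_VECTORS = {                     # made-up feature embeddings
    "laptop":  np.array([0.9, 0.1, 0.0]),
    "mouse":   np.array([0.8, 0.2, 0.1]),
    "blender": np.array([0.0, 0.9, 0.3]),
    "toaster": np.array([0.1, 0.8, 0.4]),
}

def recommend(current_item, k=2):
    """Rank other items by cosine similarity to the one being viewed."""
    target = ITEM_VECTORS[current_item]
    scores = {
        name: float(vec @ target) / (np.linalg.norm(vec) * np.linalg.norm(target))
        for name, vec in ITEM_VECTORS.items()
        if name != current_item
    }
    return sorted(scores, key=scores.get, reverse=True)[:k]

print(recommend("laptop"))  # -> ['mouse', ...]
```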

The majority of Narrow AI in use is quite practical – computer vision, for example, can be trained to identify, classify, and track objects within images and video footage. Aside from self-driving cars, computer vision is also combined with industrial robots to optimize manufacturing and warehouse logistics, while presence detection and facial recognition can be used for security and environmental control applications.
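
As a concrete example of this kind of narrow, practical computer vision, the short sketch below runs OpenCV’s bundled Haar-cascade face detector over a single image – a classical detector, not any particular vendor’s product. The file names are placeholders.

```python
# Detect faces in one image with OpenCV's stock Haar cascade.
import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

frame = cv2.imread("frame.jpg")                  # placeholder input path
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

# Returns one (x, y, w, h) box per detected face.
faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

for (x, y, w, h) in faces:
    cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
cv2.imwrite("detected.jpg", frame)
print(f"Found {len(faces)} face(s)")
```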

Painting a clearer picture

While AI appears to be revolutionizing some industries overnight, many advances have been a slow burn, with capabilities steadily improving over time. Imaging applications are excellent examples of where AI is common and more mature.

Visidon was founded in 2006 and develops software to enable hardware-independent image processing. Its algorithms are embedded in more than one billion smartphones and other devices such as laptops, video conferencing systems and security cameras.

Machine vision and sensing existed long before AI but are now being augmented by it. Vaida Jasulaityte, Visidon’s business development director, told Fierce Electronics that its algorithms are not only improving image processing quality but also addressing power constraints as devices get smarter, including those operating at the edge.

RELATED: Power-hungry AI chips face a reckoning, as chipmakers promise ‘efficiency’

Visidon’s algorithms provide quality enhancement and noise reduction by using intelligent scaling to process the pixels, which is helpful when the field of view contains shadows and low lighting – a common situation in security applications, Jasulaityte said. “We don't create new details, but we remove the noise that distorts the image.”
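
Visidon’s algorithms are proprietary, but a classical analogue illustrates the same goal of removing noise without inventing detail. The sketch below applies OpenCV’s non-local-means denoiser to a low-light frame; the file names and filter strengths are assumptions chosen for illustration.

```python
# Classical (non-AI) denoising as a stand-in for learned noise reduction.
import cv2

noisy = cv2.imread("dark.jpg")                   # placeholder low-light frame

# h / hColor set filter strength: higher removes more noise but can
# smear real texture – the detail-preservation trade-off described above.
clean = cv2.fastNlMeansDenoisingColored(
    noisy, None, h=10, hColor=10, templateWindowSize=7, searchWindowSize=21
)
cv2.imwrite("denoised.jpg", clean)
```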

The company’s algorithms are also used for video analytics, as well as for drone photography and capturing data for defect detection.

Jasulaityte said with sufficient processing power, AI can improve traditional computer vision, but the algorithms must still be small. “They need to be specifically trained for the device on which they are embedded and able to run in real time; they need to be well optimized.”
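
One common way such models are kept small enough for the device they run on is post-training quantization – storing weights in 8 bits instead of 32. The numpy sketch below shows the idea on a random stand-in weight array; it is a generic technique, not Visidon’s toolchain.

```python
# Post-training 8-bit quantization of a stand-in weight array.
import numpy as np

weights = np.random.randn(1000).astype(np.float32)   # fake layer weights

scale = np.abs(weights).max() / 127.0                # symmetric int8 range
q = np.round(weights / scale).astype(np.int8)        # 4x smaller storage

dequantized = q.astype(np.float32) * scale
print("max abs rounding error:", np.abs(weights - dequantized).max())
```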

For security applications, the processing needs to happen on the camera at the edge so that it can be done in real time.

Augmenting the edge

Lean algorithms and software can benefit from optimized hardware. Infineon Technologies recently introduced microcontrollers to support edge AI/ML applications and avoid the latency that comes with sending data back to the cloud.

Steve Tateosian, senior VP of Infineon Technologies’ consumer, industrial and IoT MCU business, said its MCUs give developers more capabilities while staying within a set power envelope, allowing more AI to be done at the edge.

He said it’s important to realize that even when the bulk of data is sent back to the cloud for processing, the device itself pays a price, because turning its radio on and off to send data and wait for the response has power implications. Tateosian said Infineon sees its PSOC Edge MCU series as a game changer because it can enable much more capability in the same, or even a slightly lower, power envelope.
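
A back-of-envelope calculation shows why the radio matters. Every figure below is an assumed, illustrative number rather than an Infineon specification, but the arithmetic conveys the trade-off.

```python
# Energy per request: cloud round-trip vs. local inference (all numbers assumed).
RADIO_TX_POWER_W = 0.20    # assumed radio power while transmitting/waiting
RADIO_AIRTIME_S  = 0.50    # assumed time the radio stays up per request
MCU_POWER_W      = 0.05    # assumed MCU power during local inference
INFERENCE_TIME_S = 0.10    # assumed local inference latency

cloud_energy_j = RADIO_TX_POWER_W * RADIO_AIRTIME_S   # 0.100 J
local_energy_j = MCU_POWER_W * INFERENCE_TIME_S       # 0.005 J

print(f"cloud round-trip: {cloud_energy_j:.3f} J per request")
print(f"local inference:  {local_energy_j:.3f} J per request")
print(f"ratio: {cloud_energy_j / local_energy_j:.0f}x")   # 20x under these assumptions
```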

One of the key challenges with GenAI today is power consumption, so you might think GenAI won’t end up at the edge. Tateosian’s controversial take, however, is that smart devices are going to get smarter in that users will be able to ask them questions to operate them – a smart thermostat, for example, will be able to leverage language models to instruct users on how to connect, operate and use it.

“There is a more targeted place for generative AI happening at the edge because the edge products are known,” he said. “My thermostat knows it's a thermostat. My washing machine knows it's a washing machine. My robot on the assembly floor knows it puts windows into cars. That's all it does.”

Matt Jones, GM of IP Cores at Rambus, said GenAI as it exists now has brought attention to AI as a whole as it evolves from AI 1.0 – recommendation engines and interactive voice applications such as Siri and Alexa – to AI 2.0, in which AI is applied more widely and practically to less sexy use cases such as assembly line inspection and preventive maintenance.

The next wave of AI, Jones said, will be about helping to augment the workforce, which of course is accompanied by fears of AI displacing the workforce. But he also noted that power remains a major hurdle, especially on the GenAI front.

People displacement, power consumption remain concerns

Steven Woo, fellow and distinguished inventor at Rambus, said that moving data consumes a lot of power, and often accounts for the bulk of consumption when using memory types that are preferred for AI, such as High Bandwidth Memory (HBM). “This data movement is such a big problem for AI and it's only getting worse.”

He said one solution is to increase compute density, which seems counterintuitive, since that would drive power consumption up, but it also reduces how often the data needs to be moved. Woo said Nvidia’s Blackwell architecture for GenAI is a good example of that approach.
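
A rough worked example makes the reasoning visible. The per-operation energies below are assumed orders of magnitude for illustration, not measured HBM or GPU figures.

```python
# Why data movement dominates, and why reuse (compute density) helps.
DRAM_PJ_PER_BIT = 5.0    # assumed off-chip (HBM-class) access energy
FLOP_PJ         = 0.5    # assumed energy of one on-chip FLOP

bits_per_value = 16      # a 16-bit operand
reuse          = 4       # assumed FLOPs performed per value fetched

move_pj    = bits_per_value * DRAM_PJ_PER_BIT   # 80 pJ to fetch the value
compute_pj = reuse * FLOP_PJ                    #  2 pJ to compute with it

print(f"fetching: {move_pj} pJ, computing: {compute_pj} pJ")
# Increasing reuse per fetch amortizes the 80 pJ across more work – the
# logic behind dense designs such as the Blackwell approach noted above.
```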

The models themselves, which are expensive to train, are becoming more valuable, he said, which means security is even more important.

For the foreseeable future, however, the bulk of AI activity is not GenAI. Rather, it’s what Carlos Morales, VP of AI at Ambiq, calls “supervised” AI. Like Infineon, Ambiq offers MCUs that allow more AI to be done at the edge while remaining within desirable power envelopes.

Training GenAI involves throwing a lot of content at it, while supervised AI involves labeling data – this is how an ECG algorithm recognizes a heart attack. “GenAI is very sexy and it’s taking all the attention right now,” he said.
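
The difference is easy to see in code. The sketch below trains a tiny supervised classifier on labeled examples with scikit-learn; the two “ECG-like” features and all of the data are synthetic stand-ins, not clinical measurements.

```python
# Supervised learning in miniature: labeled examples in, classifier out.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
# Two made-up features per heartbeat (e.g. ST elevation, heart rate).
normal   = rng.normal([0.0, 70.0], [0.5, 8.0], size=(100, 2))
abnormal = rng.normal([2.0, 95.0], [0.5, 8.0], size=(100, 2))

X = np.vstack([normal, abnormal])
y = np.array([0] * 100 + [1] * 100)   # the labels ARE the supervision

model = LogisticRegression().fit(X, y)
print(model.predict([[1.8, 100.0]]))  # -> [1], flagged as abnormal
```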

And while GenAI will be quite disruptive, all the other AI is growing three or four times faster because it runs everything, Morales said.

Supervised learning for smaller AI tasks must be more careful with data, especially in medical use cases, he said. “You don't want garbage in, garbage out.”

Applying reinforcement learning involves a human looking at the first few outputs to identify errors, Morales said.

This learning can then be applied to edge AI applications that wouldn’t be effective if the data had to be sent back to a cloud – fall detection for seniors living on their own, for example. Morales said this is the prevalent background AI that is making life better through smart automation. “It’s running everywhere.”
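
A hypothetical sketch shows why such applications live at the edge: a simple fall check over accelerometer samples, where waiting on a cloud round-trip would delay the alert. The thresholds, helper functions, and sample data below are invented for illustration.

```python
# Toy on-device fall detector: free-fall reading followed by an impact spike.
import math

FREE_FALL_G = 0.4    # assumed: total acceleration near zero while falling
IMPACT_G    = 2.5    # assumed: spike when hitting the ground

def magnitude_g(x, y, z):
    """Total acceleration in g from one 3-axis accelerometer sample."""
    return math.sqrt(x * x + y * y + z * z)

def detect_fall(samples):
    """Flag a low-g reading that is later followed by a high-g spike."""
    mags = [magnitude_g(*s) for s in samples]
    for i, m in enumerate(mags):
        if m < FREE_FALL_G and any(later > IMPACT_G for later in mags[i + 1:]):
            return True
    return False

# Fabricated window: standing (~1 g), falling (~0.2 g), impact (~3 g).
window = [(0.0, 0.0, 1.0), (0.0, 0.0, 0.2), (0.5, 0.3, 3.1)]
print(detect_fall(window))  # -> True
```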

Practical, real-world AI growth outpaces GenAI

Machine learning-based predictive analytics are another unglamorous example of real-world AI that has permeated many industries.

While faster, more efficient hardware is critical to advancing more complex AI, for practical applications it is software that is essential for optimizing decision processes in complex environments such as supply chains, including manufacturing facilities and warehouses.

Cosmo Tech offers an AI simulation tool that helps organizations make decisions based on data gleaned from a 360-degree digital twin of an enterprise. Cosmo Tech co-founder Michel Morvan said AI has become an essential tool for finding ways to reduce costs and increase efficiency and profits while also becoming more resilient and sustainable.

All of these can be affected by a single decision or disruption to the supply chain – a boat stuck in the Suez Canal, for example.

While it wasn’t possible to predict that a boat might jam the canal, Morvan said, it is possible to simulate a scenario where the canal becomes no longer available for transit.
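
A toy Monte Carlo sketch shows what “simulate the canal becoming unavailable” can mean in practice. The route-time distributions below are invented for illustration and are not Cosmo Tech’s models.

```python
# Compare expected shipping lead time with and without the canal route.
import random

def shipment_days(canal_open):
    if canal_open:
        return random.gauss(20, 2)    # assumed: ~20-day route via the canal
    return random.gauss(34, 4)        # assumed: longer reroute around the Cape

def expected_lead_time(canal_open, trials=100_000):
    return sum(shipment_days(canal_open) for _ in range(trials)) / trials

print(f"canal open:   {expected_lead_time(True):.1f} days")
print(f"canal closed: {expected_lead_time(False):.1f} days")
# The ~14-day gap is what downstream inventory and production plans
# would have to absorb as the disruption cascades through the chain.
```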

AI’s predictive capabilities can help to prepare for the impact of a major disruption and the many uncertainties across the supply chain, Morvan said. “You have to adapt to real time events.”

If an organization wants to increase profits and reduce carbon emissions, it’s now possible to determine what decisions must be made as well as the associated uncertainties and cascading effects.

Digital twins have become commonplace in chip design, but Morvan notes that Cosmo Tech is leveraging predictive AI to make a digital twin of the future, not just a snapshot of a current system. “We are able to show how the system is going to evolve, which is extremely important when you want to make a decision.”

Using past data to predict the future has limitations, Morvan said, which is why no one predicted a boat being stuck in the Suez Canal. “You need to use more than that because you need to forecast.” AI can simulate the impact of potential events that have not happened in the past, he said.

Using predictive and prescriptive AI can help organizations make smarter decisions to optimize the supply chain and understand how a local event can have a cascading effect, Morvan said, but the system must continually scan for the key parameters that could affect the supply chain so it can propose an appropriate response.

Where this type of predictive AI and GenAI will meet is when it’s possible for users to query the system using natural language, Morvan said. “It's not the GenAI that is doing the prediction or the prescription.”