Artificial Intelligence and the Analog Revolution

For many of us, it doesn't feel that long ago that we made the switch from analog to digital television. Since the 1990s, digital technologies have been on the rise, and we can no longer live without them. The processors, cameras and other components that make up our digital devices have grown enormously in efficiency, speed and power. But with the introduction of artificial intelligence (AI) into our daily lives, the limits of this progress are quickly becoming apparent. So how is it possible that artificial intelligence is still seen as the enabling technology of this century?

The hardware makes it possible (or does it?)

To see the limits and shortcomings of current techniques, we must first return to the basics of computer technology. The processors and memories that make all calculations and algorithms possible work with ones and zeros. These binary streams make it possible to calculate everything with unimaginable precision, to many decimal places. Over the years, architectures jumped from 1-bit to 4-, 8-, 32- and now 64-bit systems. Thanks to these developments, a computer can use and store many more, and much larger, numbers simultaneously in its programs.
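
As a quick illustration of what those word sizes mean: the number of distinct values an n-bit word can represent doubles with every extra bit. A few lines of Python make this concrete:

```python
# An n-bit word can represent 2**n distinct values, so every extra bit
# doubles the range a processor can work with.
for bits in (1, 4, 8, 32, 64):
    print(f"{bits:>2}-bit word: {2**bits:,} distinct values")
```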

Scaling these systems up to do more work, faster and more efficiently, has meant that we can analyse millions of data points and train machine learning models with up to 530 billion parameters, as in the Megatron-Turing Natural Language Generation (MT-NLG) model. But how future-proof is it to train models of such absurd size on supercomputers that run day and night with a power draw of 2,646 kW? At that rate, a single hour of training already consumes more energy than an average household uses in an entire year (2,479 kWh)!
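
The arithmetic behind that comparison is easy to check with the figures quoted above:

```python
# Figures quoted above: a training supercomputer drawing 2,646 kW of power,
# versus an average household consuming about 2,479 kWh per year.
supercomputer_power_kw = 2646
household_annual_kwh = 2479

# Energy = power x time, so one hour of training consumes:
one_hour_of_training_kwh = supercomputer_power_kw * 1
print(f"One hour of training:    {one_hour_of_training_kwh:,} kWh")
print(f"One household, one year: {household_annual_kwh:,} kWh")
# A single hour of training already exceeds a household's annual consumption.
```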

Our brain is actually a more efficient computer

The answer is simple: we need to become more sustainable. The current processors (CPUs and GPUs) that do the computational work in artificial intelligence systems are not equipped to handle the ever-growing data load. It simply costs too much energy to keep these models running and to keep developing them at the current pace of innovation.

Fortunately, there is a solution: approximate computing, also known as 'fuzziness'. The idea is borrowed from the forerunner in the field of intelligence: the human brain. To recognise an animal in a photo or a word in a text, we do not need all the information. The brain does not work with arithmetic that is accurate to hundreds of decimal places. We may only need 20% of a photo, or just the outline of an animal, to recognise it. Or take a typo or a word with missing letters: most people will still understand its meaning. The lesson? The accuracy of the end result does not depend on the accuracy of every individual intermediate step. Applying this principle in processors can yield enormous savings in chip area and energy consumption. A 64-bit system used for AI can drop back to 8-, 4- or perhaps even 2-bit, with all the savings that entails.
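
In machine learning, this principle is applied through quantisation: storing and computing with weights at far lower precision. The sketch below (with illustrative sizes and a simple scaling scheme chosen purely for demonstration) shows how little a dot product changes when 64-bit weights are squeezed into 8 bits:

```python
import numpy as np

# Illustrative sketch of quantisation: map float64 weights onto 8-bit
# integers with a single scale factor, then compare the results.
rng = np.random.default_rng(0)
weights = rng.normal(0.0, 0.5, size=1000)   # "full precision" float64 weights
inputs = rng.normal(0.0, 1.0, size=1000)

scale = np.abs(weights).max() / 127          # map weights into [-127, 127]
w_int8 = np.round(weights / scale).astype(np.int8)

exact = weights @ inputs
approx = (w_int8.astype(np.float64) * scale) @ inputs

print(f"float64 result: {exact:.4f}")
print(f"int8 result:    {approx:.4f}")  # nearly identical, at 1/8 the storage
```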

AI by Design

Apart from the fact that maximum computational accuracy is not what determines the performance of an AI algorithm, there are other hardware developments that can help us use our resources more efficiently. I call it AI by Design, better known as 'Analog AI'. Analog artificial intelligence is made possible by changes in semiconductor architectures. Chips will no longer be designed according to the von Neumann architecture, which constantly moves data back and forth between processor and memory. Instead, the design of a neural network is translated into programmable nodes on the chip itself by means of Phase-Change Memory (PCM): the network's weights are stored as analog conductance values, so the multiply-and-accumulate operations take place inside the memory itself. Estimated energy savings with this technique compared to traditional CPUs are a factor of 10,000 to 2,000,000 per calculation. Spread that saving over the billions of individual calculations required to train a neural network and the gains are immense.
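
Conceptually, such a chip computes a matrix-vector product in a single analog step: input voltages are applied across a grid of PCM conductances, and Ohm's and Kirchhoff's laws sum the resulting currents. The simulation below sketches that idea; the array sizes and the 5% noise level are illustrative assumptions, not measured device data:

```python
import numpy as np

# Conceptual simulation of an analog PCM crossbar: weights are stored as
# conductances G, inputs arrive as voltages V, and the output currents
# I = G @ V emerge in one analog step, with no data shuttled between
# a separate processor and memory.
rng = np.random.default_rng(1)

G = rng.uniform(0.1, 1.0, size=(4, 8))   # programmed conductances (weights)
V = rng.uniform(0.0, 1.0, size=8)        # input voltages (activations)

ideal = G @ V                            # what an exact digital chip computes

# Real analog devices are noisy and drift over time; approximate computing
# tolerates this because the end result does not need perfect precision.
G_noisy = G * (1 + rng.normal(0.0, 0.05, size=G.shape))
analog = G_noisy @ V

print("ideal :", np.round(ideal, 3))
print("analog:", np.round(analog, 3))
```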

Opportunities for subsidies

That artificial intelligence is a powerful technology is beyond dispute, but to make it truly future-proof we still have a long way to go. In particular, much improvement is needed in the fields of sustainability and performance. Knowledge institutions, the semiconductor industry and developers of AI models have a lot of work to do, whether individually or together, to bring these solutions from the lab into practice.

In other words, analog AI will create enormous opportunities for the entire sector in the coming years, from research to design, testing and production. Investments will be made across the board to make this technology a reality, which makes it a topic of interest for subsidy providers at home and abroad. Think of Horizon Europe or EFRO, but the MIT R&D collaboration scheme is also very well suited to partnerships in this field.

More information about AI grants

Would you like more information? Or do you want to contribute to these challenges? I would be happy to discuss these developments, how to form a consortium, or the subsidy possibilities for your AI project. Contact me at k.olderikkert@hezelburcht.com or call me directly on 088 495 20 00 for a one-on-one meeting.
