
February 01, 2019 // By Oliver Mihm
Is AI really necessary?
Artificial intelligence (AI) is the latest buzzword in the electronics industry, much as the Internet of Things (IoT) was the hot topic a couple of years ago. Just as "IoT" was often used where "connectivity" would have been the better fit, "AI" is now applied as an umbrella term to processes that do not use artificial intelligence in the true sense of the phrase.

Oliver Mihm is President of Plexus EMEA (www.plexus.com)

The key to adopting AI principles in the near term is to narrow the scope of integration to specific technologies. When individuals consider AI, they often think of it in a general sense: an all-knowing machine capable of learning everything about the world around it. What is more practical, and will be more effective for companies incorporating intelligent technology, is a focus on how large sets of data can be used to improve the effectiveness of the production at hand. The more data there is to capture, the better the machine can be programmed to learn patterns and optimize the effectiveness of the process.

Where the electronics industry stands today is certainly at a stage of sophisticated manufacturing. Robotics can be described as "intelligent" in that non-human objects take in information, make a decision and iterate on the process. While this does not represent the full scale of what AI can do within manufacturing, it does show how the guiding principles of AI are being applied today. The introduction of true, full-scale AI is coming, most likely within customers' products before it reaches the manufacturing floor. While progress may seem slow now, in ten years' time the impact of the change will be widely apparent.
