Semiconductors Making People’s Lives Better in the AI Era
Mr. John Yong-In Park
Since joining Samsung Electronics in 2014 as senior vice president of the next-generation product development team, he has been responsible for the development of next-generation products such as touch controller ICs and wearable bio-processors. From 2015 to 2018, he led R&D for image sensors and System LSI products, including display solutions, power ICs, and security solutions.
As executive vice president of the sensor business team from 2019, he led R&D and commercialization of advanced pixel technology and high-resolution sensors, such as the first 108MP CMOS image sensor. In 2020, he strengthened the strategic direction of the entire System LSI Business as executive vice president and head of product planning, sales & marketing. Through the development of diverse innovative solutions, he is driving the growth of Samsung’s System LSI Business.
Before Samsung, he had been president, chief executive officer, and a member of the board of directors at Dongbu Hitek since 2009. Before joining Dongbu, he held key technology management positions at Texas Instruments from 1999 to 2007 and was analog group leader at LG Electronics from 1987 to 1999.
Mr. Park received his B.S. and M.S. degrees in electrical engineering from Yonsei University in 1987 and 1993, respectively.
Since AI was first defined in the 1950s, a great deal of AI research has been conducted, but technical limitations caused the field to experience several ups and downs until the early 2000s. These ups and downs were overcome through three factors: (1) algorithm innovation, (2) improvement in computing performance, and (3) the formation of big data, and AI reached a new turning point. Since then, remarkable research achievements have poured out, and recently large-scale generative AI has attracted enormous interest by creating numerous use cases not only in daily life but also in professional fields. In this flow, the four key elements of the Fourth Industrial Revolution (Hyper-Intelligence, Hyper-Connected, Hyper-Data, and Fundamental Tech.) will further accelerate the development of AI.

Currently, AI is implemented mainly in a cloud-centric form, but optimized solutions for on-device AI are now being proposed, driven by the TCO requirements of the cloud and the demand for personalized services. In the future, its form and utilization will expand to proactive AI that can make decisions on its own. This will be realized through the evolution of Hyper-Connected technology that links all smart devices on Earth via ultra-high-speed, low-latency, broadband 5G and 6G, satellite communication, and more; the qualitative and quantitative growth of big data (Hyper-Data); and Hyper-Intelligence implemented with processors and architectures specialized for AI. Meanwhile, fundamental technologies such as strong security and low-power power-management systems will support more secure and robust AI implementation.

On the other hand, the risks posed by AI must be recognized, and there is a responsibility to use and develop it in the right direction. Only then can an AI era that benefits humanity be created.
Precious Memories in the Nanoscale Era
Dr. Rajiv Joshi
Dr. Rajiv V. Joshi is an IEEE Fellow, winner of the prestigious IEEE Daniel Noble Award, and a key technical lead/Research Scientist at the T. J. Watson Research Center, IBM. He received his B.Tech. from IIT Bombay (India), his M.S. from MIT, and his Dr. Eng. Sc. from Columbia University. He successfully led predictive failure analytics techniques for yield prediction, as well as technology-driven SRAM at the IBM Server Group. His statistical techniques, tailored for machine learning and AI, have been licensed and commercialized. He has received three Outstanding Technical Achievement awards (OTAs) and three highest Corporate Patent Portfolio awards for contributions to interconnect technologies, holds 70 invention plateaus, and has over 284 US patents covering front-end and back-end-of-line processes and structures, volatile and non-volatile memories, compute-in-memory structures, machine learning algorithms, and quantum computing, as well as over 430 international patents. He has authored or co-authored over 225 papers and given over 60 invited/keynote talks and several seminars. He received the NY IP Law Association “Inventor of the Year” award in February 2020, an industrial pioneer award from the IEEE Circuits and Systems Society in 2014, and the Best Editor Award from the IEEE TVLSI journal. He was inducted into the New Jersey Inventors Hall of Fame in August 2014, won the Mehboob Khan award twice from the Semiconductor Research Corporation, and has won several best paper awards (ISSCC 1992, ICCAD 2012, ISQED, and VMIC). He is a member of the IBM Academy of Technology and a master inventor. He serves on the Board of Governors of the IEEE CAS Society as an industrial liaison and as an IEEE CAS Ambassador to India, and has served as a Distinguished Lecturer for the IEEE CAS, CEDA, and EDS societies. He is an ISQED and World Technology Network fellow and a distinguished alumnus of IIT Bombay.
Conventional memories like SRAMs are the workhorse of the semiconductor industry. This is all the more true for servers, main processors, the Internet of Things (IoT), Systems on Chip (SoC), Artificial Intelligence (AI), and other emerging applications such as quantum computing. However, scaling beyond sub-10nm technology places a burden on memories in terms of functionality, performance, area, power, and yield. This talk highlights the key areas for volatile memories, e.g., a wide variety of SRAMs: reduction in active power, leakage power, short-circuit power, and collision power. Novel read/write assist techniques are key to proper functionality, and recent developments are highlighted that demonstrate extremely low-voltage operation (Vmin) amid device mismatches induced by process, geometry, and environment (voltage and temperature).
Artificial Intelligence (AI) aims to mimic human intelligence, pushing the limits of big-data analytics and thereby of computing power as well as storage capability. This will increase the use of GPUs, CPUs, and volatile and non-volatile memories in data-centric accelerators for neural networks (NN) and IoT, as well as in high-performance processors. Such demand requires a reduction in energy usage for sustainability and an improvement in throughput for performance. The talk covers memory solutions for AI as well as quantum applications at extremely low Vmin (below 0.3V) with the aim of reducing power. To mitigate memory-wall problems related to latency, in-memory computation techniques are explored and the opportunities they present are brought out. The talk concludes by summarizing challenges and future directions for various memory applications.