[GUEST BLOG] Key Takeaways from CET’s Artificial Intelligence (AI) Research Computing Symposium
by William J. Rowe, Ph.D.
Recently, the College of Engineering and Technology hosted a day-long symposium bringing together industry leaders, faculty and students to discuss trends and technology in research computing. Technical experts from IBM, Nvidia, LSU and NCSU examined the expanding importance, applications and implications of artificial intelligence (AI), high-performance computing and data analytics.
If you did not have a chance to attend, here are three key takeaways from the event:
- AI is huge. AI is unlocking more computing power and capability than we can imagine by leveraging machine learning, deep neural networks, high-performance computing and data analytics. Computers are becoming smarter and faster at an increasingly rapid pace; some are even capable of recognizing body language. However, humans and computers each have distinct strengths and limitations. Computers are efficient at structured problem solving and pattern recognition, while humans are better at unstructured problem solving. Computers excel at reading data quickly, while humans are strong at interpreting data to find meaning. AI strives to apply computers' core strengths and capabilities to support and amplify human involvement.
- AI is likely to replace jobs currently held by humans, but to what extent? Many technological advances stem from the desire to solve problems with greater efficiency and effectiveness at lower cost. There is a growing trend of using fewer people to complete the same amount of work (see the self-service kiosks in fast-service restaurants). In the pursuit of reducing cost and remaining competitive, companies continue to seek out faster, cheaper and deeper methods of problem solving. This is where AI can help.
Yes, this means many people will be replaced, but not all. The goal of AI is to support existing knowledge and enhance current human performance. For example, a recent study found that radiologic image recognition software was able to detect breast cancer in mammogram results with 93 percent accuracy at an incredible pace of 1.5 seconds per scan, exceeding both human accuracy and speed. Here, the goal is not to eliminate the need for doctors but rather to increase accuracy, improve patient outcomes and save time (arguably the most valuable asset for people-centric organizations).
- AI is not coming. It’s already here. Companies like IBM and Nvidia are working at breakneck speed to improve existing technology and bring innovative, disruptive ideas to market. The technology is intelligent, quick and already making its way into our day-to-day lives. Even though it may not be apparent, AI is driving major changes across business, healthcare and government organizations.
Although centered on technology, the symposium and its content were not exclusive to those in the technology sector. Speakers emphasized the importance of partnership among business, technology, academia and other areas, as well as the joint conversations that need to occur for innovation to thrive.
Appreciation is given to the College of Engineering and Technology, IBM, Nvidia, LSU’s Center for Computation & Technology, and all other participants who made the event possible.
William J. Rowe, Ph.D., is an associate professor in the Marketing and Supply Chain Management department at East Carolina University’s College of Business. He is currently investigating the impact of blockchain technology on business. Dr. Rowe can be contacted at rowew@ecu.edu.