Course Title: AI 2121 - AI IN ACTION: EXPLORING APPLICATIONS AND IMPACT
QUARTER 1: INTRODUCTION TO AI
Week 1: Artificial Intelligence: Defined
Objectives:
This module aims:
1. To define artificial intelligence; and
2. To discuss the fundamentals of artificial intelligence.
Introduction to artificial intelligence
Applications of artificial intelligence are everywhere, but what exactly does the term mean? Kumar Abhishek outlines the development and history of artificial intelligence in this article.
Artificial intelligence (AI) is the capacity of machines to mimic or improve upon human cognition, such as reasoning and learning from experience.
Although artificial intelligence has long been a component of computer programs, it is now used in a wide variety of goods and services. For instance, some digital cameras use AI algorithms to identify the objects in a picture. Experts also anticipate a wealth of cutting-edge applications for AI in the future, such as intelligent electrical grids.
AI solves real-world problems using methods drawn from economics, algorithm design, and probability theory. Computer science provides the tools for designing and implementing algorithms, while mathematics supplies the methods for modeling and solving the resulting optimization problems.
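To make the connection between AI and mathematical optimization concrete, here is a minimal illustrative sketch (our own example, not from the article) that fits a straight line to noisy data with gradient descent, the optimization workhorse behind much of machine learning.

import random

random.seed(0)

# Toy data: y = 2x + 1 plus a little noise (values chosen for illustration).
data = [(x, 2 * x + 1 + random.uniform(-0.5, 0.5)) for x in range(10)]

w, b = 0.0, 0.0        # model parameters to be learned
learning_rate = 0.01

for step in range(2000):
    # Gradients of the mean squared error with respect to w and b.
    grad_w = sum(2 * (w * x + b - y) * x for x, y in data) / len(data)
    grad_b = sum(2 * (w * x + b - y) for x, y in data) / len(data)
    # Take a small step against the gradient to reduce the error.
    w -= learning_rate * grad_w
    b -= learning_rate * grad_b

print(f"learned w = {w:.2f}, b = {b:.2f} (true values: 2 and 1)")

After enough steps, the learned parameters approach the true slope and intercept; modern AI systems apply the same basic idea to models with millions of parameters.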
Although Alan Turing proposed his "imitation game" for measuring machine intelligence back in 1950, the notion of artificial intelligence has only recently become practical, thanks to growing access to the data and processing power needed to train AI systems.
To grasp the concept underlying artificial intelligence, consider what sets human intelligence apart from that of other species: our capacity to draw lessons from past experiences and apply them to novel circumstances. Our brains, with their exceptionally large number of cortical neurons, give us the processing power to accomplish this.
Today's computers are nowhere near replicating the organic neural networks of the human brain. Machines do have one big edge over us, however: they can evaluate enormous volumes of data and experience far more quickly than humans could ever hope to.
Using AI, you can concentrate on the most important activities and make smarter judgments based on data gathered about a use case. It can be applied to complicated tasks such as finding a delivery truck's optimal route, anticipating maintenance needs, and identifying credit card fraud. To put it another way, AI can automate many business tasks, freeing you up to focus on your core business.
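To make the routing example tangible, below is a deliberately simplified sketch (our own illustration; real routing engines use far more sophisticated optimization) that orders a few hypothetical delivery stops with a greedy nearest-neighbor heuristic.

import math

# Hypothetical delivery stops as (x, y) coordinates; the depot comes first.
stops = [(0, 0), (2, 3), (5, 4), (1, 7), (6, 1)]

def distance(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def nearest_neighbor_route(points):
    """Greedy heuristic: always drive to the closest unvisited stop."""
    route = [points[0]]
    remaining = list(points[1:])
    while remaining:
        nearest = min(remaining, key=lambda p: distance(route[-1], p))
        remaining.remove(nearest)
        route.append(nearest)
    return route

route = nearest_neighbor_route(stops)
total = sum(distance(route[i], route[i + 1]) for i in range(len(route) - 1))
print("visit order:", route)
print(f"total distance: {total:.2f}")

The greedy route is rarely optimal, which is exactly why routing is a hard problem that benefits from more powerful AI and optimization techniques.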
The goal of research in this field is to create machines that can automate jobs requiring cognitive behavior. Examples include control, scheduling, and planning; answering customer and diagnostic queries; handwriting and speech recognition; natural language processing and perception; and object manipulation and movement.
The evolution of artificial intelligence across time
With all the attention artificial intelligence receives these days, it is easy to forget that it is not a completely new field. AI has passed through several distinct eras, each characterized by an emphasis either on proving logical theorems or on attempting to replicate human cognitive processes through neural modeling.
The study of artificial intelligence began in the late 1940s when computer
pioneers like Alan Turing and John von Neumann began to investigate how
computers might "think."
The research goal over the following two decades was to apply artificial intelligence to practical problems. Expert systems, which encode human expertise as rules that let machines draw conclusions from collected data, are one result of this progress. Although expert systems lack the complexity of the human brain, they can be built to recognize patterns in data and reach conclusions from those patterns. Today they are frequently employed in manufacturing and medicine.
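To give a feel for how an expert system reaches conclusions, here is a deliberately tiny sketch in Python; the diagnostic rules and facts are hypothetical, invented purely for illustration.

# Each rule pairs a set of required facts with a conclusion.
# Rules and facts here are hypothetical, for illustration only.
RULES = [
    ({"engine_cranks": False, "battery_low": True},
     "Recharge or replace the battery."),
    ({"engine_cranks": True, "fuel_level_low": True},
     "Refuel the vehicle."),
    ({"engine_cranks": True, "fuel_level_low": False},
     "Check the ignition system."),
]

def diagnose(facts):
    """Return the conclusion of every rule whose conditions all hold."""
    conclusions = []
    for conditions, conclusion in RULES:
        if all(facts.get(key) == value for key, value in conditions.items()):
            conclusions.append(conclusion)
    return conclusions or ["No rule matched; consult a human expert."]

print(diagnose({"engine_cranks": False, "battery_low": True}))

Real expert systems contain thousands of such rules plus an inference engine that chains them together, but the core idea of matching encoded expertise against observed facts is the same.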
A second significant turning point was the creation of systems such as the robot Shakey and the conversational program ELIZA in the mid-1960s, which demonstrated the automation of basic human-machine dialogue. Voice assistants such as Siri and Alexa descend from the advances in speech and language technology that these early systems set in motion.
The first wave of enthusiasm surrounding artificial intelligence lasted about ten years. It produced important advances in robotics, theorem proving, and programming language design. However, it also sparked a backlash against the field's exaggerated promises, and funding was drastically cut starting in 1974.
After a decade of little progress, interest revived in the late 1980s. The main drivers of this resurgence were reports of computers surpassing humans at "narrow" tasks such as chess and checkers, together with advances in computer vision and speech recognition. This time, the focus was on developing systems that could understand and learn from real-world data with less human involvement.
These changes proceeded gradually until 1992, when interest began to climb once more. First, growth in computing power and information storage technology fueled renewed interest in artificial intelligence research. Then, the steady advances in computer technology since the early 1980s propelled another major surge in the mid-1990s. The result has been notable performance gains on several important
benchmark problems, including image recognition, where computers are now nearly as proficient as humans in certain areas.
Artificial intelligence made tremendous strides in the first part of the twenty-first century. The first major advancement was the development of self-learning neural networks.
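As a taste of what "self-learning" means in practice, the sketch below (our own illustration, using NumPy) trains a minimal two-layer neural network to learn the XOR function from four examples by gradient descent.

import numpy as np

np.random.seed(0)

# XOR truth table: the output is 1 when exactly one input is 1.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# A two-layer network: 2 inputs -> 4 hidden units -> 1 output.
W1 = np.random.randn(2, 4)
b1 = np.zeros(4)
W2 = np.random.randn(4, 1)
b2 = np.zeros(1)

lr = 0.5
for step in range(10000):
    # Forward pass: compute the network's predictions.
    hidden = sigmoid(X @ W1 + b1)
    out = sigmoid(hidden @ W2 + b2)
    # Backward pass: gradients of the squared error.
    d_out = (out - y) * out * (1 - out)
    d_hidden = (d_out @ W2.T) * hidden * (1 - hidden)
    # Nudge every weight against its gradient ("learning").
    W2 -= lr * hidden.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_hidden
    b1 -= lr * d_hidden.sum(axis=0)

print(np.round(out.ravel(), 2))  # should approach [0, 1, 1, 0]

No rule for XOR is ever written down; the network discovers it by repeatedly adjusting its weights to shrink its own errors, which is the essence of self-learning.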
References:
Abhishek, K. (2023). Introduction to Artificial Intelligence. Simple Talk, Redgate. Retrieved from https://www.red-gate.com/simple-talk/development/data-science-development/introduction-to-artificial-intelligence/