Monograph
CHARACTERISTICS OF INNOVATION:
• Innovations vary in extent or magnitude, i.e. the degree to which they deviate from the past.
• Innovation is closely related to problem solving, since the generation and implementation of ideas for change never transpire without difficulty.
• A final characteristic is the impact of the change, the significance or range of its effects.
Examples:
   1. Evolution of Computers
   2. Evolution of Programming languages
Invention: An invention is a new product, process, or device created for the first time through original thought and experimentation.
Examples: the telephone and the electric light bulb.
                              Classification of innovation
Based on category:
       Product Innovation
       Service innovation
       Process Innovation
       Marketing Innovation
       Organisational Innovation
Product Innovation:
   New products. These are goods and services that differ significantly in their characteristics or
    intended uses from products previously produced by the firm. The first microprocessors and
    digital cameras are examples of new products using new technologies. The first portable
    MP3 player, which combined existing software standards with miniaturized hard-drive
    technology, was a new product combining existing technologies.
   New uses for products. The development of a new use for a product with only minor changes
    to its technical specifications is a product innovation. An example is the introduction of a
    new detergent using an existing chemical composition that was previously used as an
    intermediary for coating production only.
Service Innovation:
Process Innovation:
It includes significant changes in techniques, equipment and software.
Example:
Administrative reorganisation
Marketing innovation:
Organisational innovation
Product innovation is the introduction of new or significantly improved goods or services. With respect to the characteristics of goods or services, it involves improvements in technical specifications, components and materials, ease of use, and the incorporation of software and other functional characteristics. Product or service innovation addresses the source of change that enables a competitive advantage; in services, innovation includes the provision of a new service. In short, product/service innovation is related to changes in the products or services provided by an organization.
Process innovation is centred on improving the efficiency and effectiveness of the production
process. It involves changes in the way products and services are created and delivered to
customers.
Marketing innovation addresses the implementation of new methods, with significant changes in product development, packaging, promotion, positioning, and even pricing. Marketing innovation seeks to address consumers' needs by opening new markets or repositioning a company's product within the market, with the aim of increasing sales.
Organizational innovation is essential for companies that intend to meet strategic challenges, since it results in improvements in the organization's management. Organizational innovation means the implementation of a new organizational method in a company's business practices, such as the arrangement of the workplace and external relationships. New methods aid the organization's routines and procedures, in addition to driving the work and practices which facilitate learning and knowledge sharing within the company.
EXAMPLES:
AGRICULTURAL DRONES
    Farmers have begun to use agricultural drones adorned with cameras to improve the
     treatment of their crops. The drones allow farmers a unique perspective that previously-
     used satellite imagery could not provide. They help to expose issues with irrigation
     treatment, soil variation, and distressed plants at a much lower cost than methods like
     crop imaging with a manned aircraft. The success of the drones is made possible by
     technological advances in GPS modules, digital radios, and small MEMS sensors.
     Together, these advances allow farmers to bring greater precision to their craft in order to
     reap greater rewards.
 Agricultural drones provide relief for the modern-day farmer. Drone technology can cut down labour requirements and reduce resource requirements (such as fresh water and pesticides).
BRAIN MAPPING
 Neuroscientists have worked for decades to better understand how the brain functions.
  Recent advances in brain mapping technology have made that ambitious task easier. An
international team of researchers at the Human Brain Project has created a three-dimensional atlas of the brain. The map's resolution is fifty times better than previous efforts. The atlas creators digitally stitched together thousands of brain cross-sections.
  The map shows details up to 20 micrometers in size—the estimated size of many human
  cells. While this is a huge advancement, scientists still aim to create a map that shows
  details at 1 or 2 micrometers, rather than 20.
3-D PRINTING
 The potential of 3-D printing technology has many people excited about new applications. But current printers have important limitations. Until recently, most 3-D printers could only use plastic. A group of researchers at Harvard University, led by Jennifer Lewis, has started to develop new 3-D printer inks. Her team prints intricate
  objects using materials that are chosen based on their mechanical properties, electrical
  conductivity, or optical traits. Eventually new inks will enable a wider variety of
  functions, including artificial organ creation.
MOBILE COLLABORATION
 Mobile devices enable improved access to existing desktop tools. The most common
  examples of mobile versions are email access on smartphones, instant messaging (IM)
  and Web conferencing tools.
Social Innovation
 Social innovations are new strategies, concepts, ideas and organizations that meet social needs in areas ranging from working conditions and education to community development and health; they extend and strengthen civil society.
       Examples of Social Innovation (Business)
    Unilever - Provides food and lifestyle products worldwide with a focus on environmental
     and social innovation.
 Code World Club - Mission: To give every child in the world the chance to learn to code.
 Blind Square - Helping the visually impaired interact with their surroundings.
    Summo - A communication service for mobile clinics to empower nurses who work in
     mobile clinics in rural South Africa.
    PYBOSSA - With PyBossa you can distribute all kinds of tasks to thousands of
     volunteers.
BASED ON NOVELTY:
Radical Innovation:
    Radical innovation involves introducing new products or services that develop into major
     new businesses or spawn new industries, or that cause significant change in a whole
     industry and tend to create new values.
   1. Salesforce is one of the few companies to have launched a truly radical innovation. Its
      CRM system harnesses not only a new technology platform in the form of cloud
      computing but a new business model too. When the company launched back in 1999, its
      business model of selling the software as a service was truly innovative. And to this day
      the company continues to push the boundaries – it has been named by Forbes as one of
      the World’s Most Innovative Companies 4 years in a row.
   Incremental Innovation:
 Incremental innovation seeks to improve the systems that already exist, making them better, faster and cheaper.
 Incremental innovation includes the modification, refinement, simplification,
  consolidation, and enhancement of existing products, processes, services, and production
  and distribution activities. The majority of innovations fall in this category.
 Examples of incremental innovation can be found across industries, markets and regions: Uber, for hailing a ride with a tap through a mobile app; Square, for enabling individuals and small businesses to accept credit/debit card payments through their iOS or Android devices; and Flipkart, for its ability to collect cash on delivery. All are great illustrations of the huge benefits incremental innovation can provide.
One of the most successful recent examples of incremental innovation is the iPhone. While smartphones existed before Apple entered the market, it was mostly the incremental innovations of a larger touchscreen, the app store, various ease-of-use refinements and an improved overall experience that enabled the iPhone to be the first to make smartphones mainstream.
Apple then created a whole new ecosystem which made the iPhone a preferred medium for accessing the internet, sending e-mail, finding directions, playing games, conducting online transactions and generally becoming a central part of our daily lives.
     Systematic Innovation:
   Systematic innovation is the process of methodically analyzing and solving problems
    with a primary focus on identifying the correct problem to be solved and then
    generating innovative solution concepts free from mental inertia.
     Steps involved in Systematic Innovation
Developing a systematic innovation strategy from scratch is a challenge many companies face in today's fast-paced world. The steps below outline how a company can create its own innovation practice, one developed specifically for its product, process or service.
   Step one – identify the company’s type of innovation.
   Step two is scouting and strategizing for innovation.
   Step three is committing to, and cultivating, an innovation environment.
   Step four is to innovate disruptively.
     The fifth, and final, step is to sustain innovation
     Example of Systematic Innovation
     For example, mainframe computers in the late 1970s led to the personal computer, a new
     paradigm of computing. Today, mobile phones are making conventional landline phones
     obsolete and Internet phones are in some cases replacing mobile phones.
• Substitute
• Combine
• Adapt
• Magnify / Minify
• Eliminate
• Reverse, Rearrange
             Chapter 2: Major Evolutions related to Computers
 Command User Interface – Early operating systems such as UNIX, MS-DOS and Linux used typed commands as the way of interaction.
 Graphic User Interface – With the invention of the mouse, a new way of human-computer interaction was devised: the Graphical User Interface (GUI). The GUI uses graphic icons and a pointing device such as the mouse to give instructions to the computer.
 The first computers to successfully use a GUI were the Xerox Alto (1973) and the Xerox Star (1981). Apple commercially released its GUI-based Macintosh computers in 1984.
Based on the success of Apple's Macintosh computers, Microsoft designed its own GUI-based operating system, known as Windows. Starting with Windows 95, many successive GUI-based Windows operating systems have been released by Microsoft; the latest in the series is Windows 8, released in 2012.
First Generation Computers
These computers used a large number of vacuum tubes for circuitry and magnetic drums for memory. They often required an entire room to be installed. They were very expensive and were hence mainly used for scientific purposes. They also needed a lot of electricity and thus generated enormous heat. First generation computers could be programmed only in machine language, consisting of 0's and 1's. These computers could solve only one problem at a time. Input was fed using punched cards and paper tape; the output was generated on printouts. Examples include UNIVAC (Universal Automatic Computer) and ENIAC (Electronic Numerical Integrator and Calculator).
Second Generation Computers
These computers were manufactured using transistors rather than vacuum tubes. Transistor-based computers were far superior to their vacuum-tube predecessors: smaller, faster, cheaper, more energy efficient and more reliable. The programming of second generation computers was done using symbolic or assembly language, which allowed programmers to specify instructions in words. Input was fed using punched cards and the output was generated on printouts. Second generation computers were the first to store instructions in memory, which moved from magnetic drum to magnetic core technology.
Third Generation Computers
The development of the integrated circuit (IC) was the hallmark of the third generation of computers. Several electronic components such as transistors, resistors, and capacitors were miniaturized and placed on silicon chips, called ICs, which drastically increased the speed and efficiency of computers. ICs were smaller, less expensive, more reliable, faster in operation, consumed less power, and generated less heat than the components used earlier. These computers had a few megabytes of main memory and magnetic disks that could store a few tens of megabytes of data per disk drive. On the software front, high-level programming languages such as COBOL and FORTRAN were standardized by ANSI (American National Standards Institute).
Fourth Generation Computers
The microprocessor launched the fourth generation of computers, with thousands of ICs built onto a single silicon chip. Fourth generation computers could fit in the palm of the hand. For example, the Intel 4004 chip, developed in 1971, contained all the components of the computer on a single chip. The semiconductor memories used were very fast, and HDDs became cheaper, smaller, and larger in capacity. For input, floppy disks were used to port data and programs from one computer to another. During this period, many new operating
systems were developed, including MS DOS, Microsoft Windows, UNIX, and Apple’s
proprietary operating system.
In 1981, IBM introduced the first PC that was specifically meant for the home user, and in 1984
Apple introduced the Macintosh. As these small computers became more powerful, they could
be linked together to form networks, which eventually led to the development of the Internet and
other distributed systems. Fourth generation also saw the development of graphical user interface
(GUIs). At the same time, several word processing packages, spreadsheet packages, and graphics
packages were introduced, thereby making the computers a powerful tool for everyone.
Fifth Generation Computers
Fifth generation computers are based on the concept of artificial intelligence. Although such computers are still in development, certain applications, such as voice recognition, are already widely used today. Parallel computing and superconductor technology are helping to make AI a reality. In this generation, the aim is to develop devices that respond to natural language input and are capable of learning and self-organization. AI touches the following areas
among others: Gaming, Expert Systems, Natural Languages, Neural Networks, Robotics etc. The
current status is that no computer is able to completely simulate human behavior.
First Generation (1G)
The first generation of mobile networks was deployed in Japan by Nippon Telegraph and Telephone (NTT) in Tokyo in 1979. In the early 1980s, 1G gained popularity in the US, Finland, the UK and the rest of Europe. This system used analogue signals and had many disadvantages due to technology limitations.
Disadvantages of 1G system
Second Generation (2G)
The second generation of mobile communication introduced a new digital technology for wireless transmission, known as the Global System for Mobile communication (GSM). GSM became the base standard for further development of wireless standards. It supported data rates from 14.4 up to 64 kbps (maximum), which is sufficient for SMS and email services. The Code Division Multiple Access (CDMA) system developed by Qualcomm was also introduced and implemented in the mid-1990s. CDMA offers advantages over GSM in terms of spectral efficiency, number of users and data rate.
Third Generation (3G)
Third generation mobile communication started with the introduction of UMTS (Universal Mobile Telecommunications System). UMTS has a data rate of 384 kbps and supported video calling for the first time on mobile devices. After the introduction of 3G, smartphones became popular across the globe, and specific applications were developed for them handling multimedia chat, email, video calling, games, social media and healthcare.
Disadvantages of 3G systems
Fourth Generation (4G)
4G systems are an enhanced version of 3G networks, standardized by bodies such as 3GPP (LTE) and IEEE (WiMAX), offering higher data rates and capable of handling more advanced multimedia services. LTE and LTE-Advanced are the wireless technologies used in fourth generation systems. Furthermore, 4G is compatible with previous versions, so easier deployment and upgrading of LTE and LTE-Advanced networks are possible. Simultaneous transmission of voice and data is possible with LTE, which significantly improves data rates. All services, including voice, can be transmitted over IP packets. Complex modulation schemes and carrier aggregation are used to multiply uplink/downlink capacity. Wireless transmission technologies like WiMAX were introduced in 4G systems to enhance data rate and network performance.
Disadvantages of 4G system
Fifth Generation (5G)
5G will use advanced technologies to deliver ultra-fast internet and multimedia experiences to customers. Current LTE-Advanced networks will transform into supercharged 5G networks in the future. To achieve higher data rates, 5G will use millimeter waves and unlicensed spectrum for data transmission. Complex modulation techniques have been developed to support the massive data rates of the Internet of Things.
3rd Generation Programming Language (3GL): Third generation programming languages were closer to ordinary English and hence easier for programmers to understand. 3GLs are therefore also called High Level Languages (HLLs). Examples of 3GLs include ALGOL, COBOL, Fortran, BASIC, C and Pascal.
4th Generation Programming Language (4GL): These languages are closer to natural language than 3GLs. The most popular 4GL is SQL (Structured Query Language); a short sketch contrasting the 3GL and 4GL styles follows below.
5th Generation Programming Language (5GL): Fifth generation programming languages are used mainly in Artificial Intelligence research. Examples of 5GLs include Prolog, OPS5 and Mercury.
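To make the 3GL/4GL contrast concrete, here is a minimal runnable sketch in Python (itself a 3GL) that issues a declarative, 4GL-style SQL query through the standard sqlite3 module; the table name and data are invented for illustration:

    import sqlite3

    # Build an in-memory database with a hypothetical "students" table.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE students (name TEXT, marks INTEGER)")
    conn.executemany("INSERT INTO students VALUES (?, ?)",
                     [("Asha", 82), ("Ravi", 67), ("Meena", 91)])

    # 3GL style (imperative): spell out HOW to find high scorers, step by step.
    high_scorers = []
    for name, marks in conn.execute("SELECT name, marks FROM students"):
        if marks > 80:
            high_scorers.append(name)
    print(high_scorers)                          # ['Asha', 'Meena']

    # 4GL style (declarative): state WHAT is wanted; the engine decides how.
    rows = conn.execute("SELECT name FROM students WHERE marks > 80").fetchall()
    print([name for (name,) in rows])            # ['Asha', 'Meena']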
Display and Description
CRT (Cathode Ray Tube) (1897): First developed in 1897, CRTs were used in television sets and in early computers as the display screen.
LCD (Liquid Crystal Display) (1963): Suggested in 1963 for display screens, LCDs first found application in watches and calculators. For computer displays, LCDs came into permanent use only in the 1990s.
Plasma Monitor (1964): From the first prototype release in 1964, plasma monitors found space as display screens because of their long life and wide range of contrasts and colors.
The electronic revolution started in the 18th-19th centuries and changed human life forever. The associated data storage technologies evolved along with computers. Let us learn how these storage technologies evolved over time.
Magnetic Drum (1932): Invented in Austria, the magnetic drum was an early form of computer memory. Electromagnetic pulses were stored by changing the magnetic orientation of ferromagnetic particles on the drum.
Magnetic Tape (1951): Magnetic tape used magnetic pulses to store data on magnetized tapes. A tape had the capacity of storing as many as 10,000 punched cards.
Hard Disk Drive (1958): A set of magnetized circular platters that store data as magnetic dots. Introduced as data storage for IBM computers, these devices were not very popular in the 1960s and 1970s due to their immense size and price.
Optical Storage Media – CD (1980s), DVD (1990s): CDs (Compact Discs) were created in the 1980s by Philips and Sony as a replacement for aging floppy disks. DVDs (Digital Video Discs), created in the 1990s, were the next evolution of the CD; a DVD could store as much data as up to 8-10 CDs.
Blu-ray Disc (2000): The next-generation optical disc uses a 405-nanometer blue laser, which allows it to store an enormous amount of data.
The Cloud (21st Century): Cloud storage allows data to be stored on multiple servers over the internet, generally hosted by a third party.
In recent years IPR management has become increasingly important, both for industries and research institutes. The main categories of IPR include:
   1.   Patents
   2.   Copyright
   3.   Trademark
   4. Geographical Indications (GI)
   5. Design Rights etc.
IPR are the rights given to creators so that the process of creating a product, or the product itself, is not copied by anyone.
Their purpose is to protect the intellectual property of a person. Intellectual property arises from the human intellect: it can be a product created by a person, an improvement in the process of creating a product, or an invention. Ordinary property and intellectual property are not the same; they belong to different categories.
Intellectual property rights can be assigned through patents, copyright and trademarks. IPR provides incentives for creativity and the disclosure of information. It plays a significant role in encouraging innovation in the form of product or process development, leading to technological advancement.
IPR has become an increasingly important facet of the success and longevity of any organization, and of the economic growth of a country.
World Intellectual Property Day is celebrated worldwide every April 26. Many intellectual property law firms, both big and small, operate in countries such as India, the USA and the UK, helping creators and inventors protect their products.
COPYRIGHT:-
Copyright is a protection given to creators or authors for their original and unique expression of ideas. The law protects the expression of an idea, not the idea itself.
Examples include literary, dramatic, musical and artistic works, films, sound recordings, computer software, websites, etc.
Some of the things that are not covered in copyright are ideas, concepts, facts, discoveries,
recipes, name, title, phrase, etc.
Properties of Copyright:-
Copyright protection is grounded in the U.S. Constitution; the current U.S. copyright law is the Copyright Act of 1976.
Registration Procedure:-
The creator or author needs to fill in a single application form and pay a non-refundable fee.
It is not compulsory to send the original work.
If you have created more than one work, even of the same category, you have to file a separate copyright application for each work.
DURATION OF COPYRIGHT:-
Copyright lasts for the author's lifetime plus 50 years from the end of the calendar year in which the author dies.
The protection of copyright always expires on 31st December of the last calendar year of
protection.
PATENT:-
The Patents Act, 1970 extends to the whole of India. A patent is a type of exclusive IPR granted for an invention, whether a process or a product, that offers a new technical solution.
Inventions are patentable if they are novel, useful, non-obvious and constitute patentable subject matter.
The patent concept is not new; from the time of rulers and kingdoms it has been promoted. Rulers motivated their people to bring forth new discoveries and be known for them; for example, in refinement and in culinary dishes, rights were reserved for any new discovery.
REGISTRATION PROCEDURE
For all types of inventions, the term of a patent is 20 years from the filing date.
The date of filing is important: if the same idea is filed by two persons, the person who files the patent first gets the rights.
RENEWAL FEE:-
TRADEMARK:-
A symbol, word, design, color, sound, logo, phrase or other device that is used to distinguish and identify the goods or services of one party from those of another.
The Indian Trade Marks Registry was established in 1940 and presently administers the Trade Marks Act, 1999.
The trademark concept dates back a very long time.
In 1266, King Henry III of England required all bakers to make a distinctive mark on the bread they made and sold in the market.
King Edward I of England ordered all workers of gold and silver to stamp a mark on their produced goods; otherwise they were sentenced to death.
In the 19th century, Bass Brewery in the UK obtained the first registered trademark.
TM:- Trademark symbol; signifies that an application has been filed for a product.
SM:- Service mark symbol; signifies that an application has been filed for a service.
R:- Registered trademark symbol; signifies that the trademark has been officially registered.
REGISTRATION PROCEDURE:-
DURATION OF TRADEMARK:-
   •   A review paper is not a "term paper" or book report. It is not merely a report on
       some references you found. Instead, a review paper synthesizes the results from several
       primary literature papers to produce a coherent argument about a topic or focused
       description of a field.
   •   You should read articles from one or more of these sources to get examples of how
       your paper should be organized.
   •   A key aspect of a review paper is that it provides the evidence for a particular point of
       view in a field. Thus, a large focus of your paper should be a description of the data that
       support or refute that point of view. In addition, you should inform the reader of the
       experimental techniques that were used to generate the data.
• The emphasis of a review paper is interpreting the primary literature on the subject.
   •   Experimental Evidence: Describe important results from recent primary literature articles
       and
   •   Explain how those results shape our current understanding of the topic.
•   Mention the types of experiments done and their corresponding data, but do not repeat
    the experimental procedure step for step.
•   Use figures and/or tables to present your own synthesis of the original data or to show
    key data taken directly from the original papers.
• Conclusion
 Keep it brief.
• Literature Cited
New product planning is the function of top management personnel and specialists drawn from sales and marketing, research and development, manufacturing and finance.
This group considers and plans new and improved products in different phases, as given below:
1. Idea generation (Idea Formulation)
2. Screening of ideas
3. Concept Testing
4. Business analysis
5. Product development
6. Test market
1. Idea Generation:
The focus in this first stage is on searching for new product ideas. Few ideas generated at this
stage are good enough to be commercially successful. New product ideas come from a variety of
sources. An important source of new product ideas is customers. Fundamentally, customer needs
and wants seem to be the most fertile and logical place to start looking for new product ideas.
This is equally important for both consumers and industrial customers.
Product planning starts with the creation of product ideas. The continuous search for new
scientific knowledge provides the clues for meaningful idea formation.
2. Screening of Ideas:
It means critical evaluation of product ideas generated. After collecting the product ideas, the
next stage is screening of these ideas. The main object of screening is to abandon further
consideration of those ideas which are inconsistent with the product policy of the firm. The
product ideas are expected to be favourable and to give room for consumer satisfaction, profitability, a good market share and the firm's image.
All the ideas cannot be accepted, because certain product plans need huge amounts of investment, for certain plans raw materials may not be available, and certain plans may not be practicable. Many of the ideas are rejected on account of such reasons, eliminating unsuitable ideas. Only promising and profitable ideas are picked up for further investigation.
3. Concept Testing:
After the new product idea passes the screening stage, it is subjected to ‘concept testing’.
Concept testing is different from test marketing, which takes place at a later stage. What is tested
at this stage is the ‘product concept’ itself-whether the prospective consumers understand the
product idea, whether they are receptive towards the idea, whether they actually need such a
product and whether they will try out such a product if it is made available to them.
In fact, in addition to the specific advantage of getting the consumers’ response to the product
idea, this exercise incidentally helps the company to bring the product concept into clearer focus.
Concept testing helps the company to choose the best among the alternative product concepts.
Consumers are called upon to offer their comments on the precise written description of the
product concept, viz, the attributes and expected benefits.
4. Business Analysis:
More precise estimates of environmental and competitive changes that may influence the product's life cycle, replacement sales or repeat sales are also needed to develop and launch a product. A complete cost appraisal is necessary, besides judging the profitability of the project.
Market analysis involves a projection of future demand, financial commitment and return.
Financial specialists analyze the situation by applying break-even analysis, risk analysis.
Business analysis will prove the economic prospects of the new product.
5. Product Development:
The idea on paper is converted into a product. The product is shaped corresponding to the needs and desires of the buyers. Product development is the introduction of new products into the present markets. New or improved products are offered by the firm to the market so as to give better satisfaction to the present customers. Laboratory tests and technical evaluations are carried out strictly.
6. Test Marketing:
By test marketing we mean finding out, by a trial and error method, what is likely to happen when a product is introduced commercially into the market. These tests are planned and conducted in selected geographical areas, by marketing the new products. The reactions of consumers are watched.
It helps to uncover product faults, if any, which might have escaped attention in the development stage. By this, future difficulties and problems are removed. This type of pre-testing is essential for a product before it is mass produced and marketed. Sometimes, at this stage, management may take the decision to accept or reject the idea of marketing the product.
Designing the programme for test marketing involves making a number of decisions:
1. Where, and in how many markets, should the test be carried out?
3. What to test?
ii. a region or
3. Differentiation of product
5. Effective promotion
2. Shorter PLC
3. Higher expenditure
4. Fragmented markets
5. Inappropriate incentives
7. Non-cooperation of staff
➭ Communication elements
- Routers, switches,
➭ Communication links
- optic fiber
- coaxial cable
- twisted pair
➭ Topologies
Types of Network
  •   Depending on one’s perspective, we can classify networks in different ways:
          •   Based on transmission media: Wired (UTP, coaxial cables, fiber-optic cables) and
              Wireless
   Based on Size
    Mainly Classified into three kinds: LAN, WAN, MAN
 MAN: A Metropolitan Area Network spans a city; it is larger than a LAN but smaller than a WAN.
 Client Server
o Network Servers
                              Usually have more processing power, memory and hard disk space
                               than clients
                              Run Network Operating System that can manage not only data, but
                               also users, groups, security, and applications on the network
A Network Topology is the arrangement with which computer systems or network devices are
connected to each other. Topologies may define both the physical and the logical aspect of the network. The logical and physical topologies of a given network may be the same or different.
Point-to-Point
Point-to-point networks contain exactly two hosts (computers, switches, routers or servers) connected back to back using a single piece of cable. Often, the receiving end of one host is connected to the sending end of the other, and vice versa. Even if intermediate devices lie between them, the end hosts are unaware of the underlying network and see each other as if they were connected directly.
Bus Topology
In Bus topology, all devices share a single communication line or cable. Bus topology may face problems when multiple hosts send data at the same time; therefore, Bus topology either uses CSMA/CD technology or recognizes one host as Bus Master to resolve such conflicts (a simple sketch of the collision-backoff idea follows below). It is one of the simplest forms of networking, in which the failure of one device does not affect the other devices. But failure of the shared communication line makes all other devices stop functioning.
Both ends of the shared channel have a line terminator. The data is sent in only one direction and, as soon as it reaches the extreme end, the terminator removes the data from the line.
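The collision handling mentioned above can be illustrated with a small Python sketch of binary exponential backoff, the retransmission scheme classic CSMA/CD Ethernet uses after a collision; the slot counts follow the classic algorithm, but the two-host scenario is invented for illustration:

    import random

    def backoff_slots(collisions: int) -> int:
        # After the c-th successive collision, classic CSMA/CD waits a random
        # number of slot times drawn from 0 .. 2^c - 1 (doubling capped at 10).
        c = min(collisions, 10)
        return random.randint(0, 2 ** c - 1)

    # Two hosts whose frames collided twice each draw an independent delay;
    # different delays make their retransmissions unlikely to collide again.
    for host in ("A", "B"):
        print("host", host, "waits", backoff_slots(2), "slot times")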
Star Topology
All hosts in Star topology are connected to a central device, known as the hub device, using point-to-point connections; that is, there exists a point-to-point connection between each host and the hub. The hub device can be a Layer-1 device such as a hub or repeater, a Layer-2 device such as a switch or bridge, or a Layer-3 device such as a router or gateway.
As in Bus topology, the hub acts as a single point of failure: if the hub fails, connectivity of all hosts to all other hosts fails. Every communication between hosts takes place only through the hub. Star topology is not expensive, since connecting one more host requires only one cable, and configuration is simple.
Ring Topology
In Ring topology, each host machine connects to exactly two other machines, creating a circular network structure. When one host tries to communicate with or send a message to a host which is not adjacent to it, the data travels through all intermediate hosts. To connect one more host to the existing structure, the administrator may need only one more cable.
Failure of any host results in failure of the whole ring; thus every connection in the ring is a point of failure. There are methods which employ one more backup ring.
Mesh Topology
In this type of topology, a host is connected to one or multiple hosts. Mesh topology may have all hosts in point-to-point connection with every other host, or it may have hosts which are in point-to-point connection with only a few hosts.
 Full Mesh: All hosts have a point-to-point connection to every other host in the network. Thus a full mesh of n hosts requires n(n-1)/2 connections (see the sketch after this list). It provides the most reliable network structure among all topologies.
 Partial Mesh: Not all hosts have point-to-point connections to every other host; hosts connect to each other in some arbitrary fashion. This topology is used where reliability needs to be provided to only some of the hosts.
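As a quick check of the n(n-1)/2 formula in the Full Mesh item above, this minimal Python sketch counts the links a full mesh needs:

    def full_mesh_links(n: int) -> int:
        # Each of the n hosts links to the other n - 1 hosts; dividing by 2
        # avoids counting each point-to-point link twice.
        return n * (n - 1) // 2

    for hosts in (3, 5, 10):
        print(hosts, "hosts need", full_mesh_links(hosts), "links")
    # Prints: 3 hosts need 3 links, 5 need 10, 10 need 45.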
Tree Topology
Also known as Hierarchical Topology, this is the most common form of network topology in use presently. This topology imitates an extended Star topology and inherits properties of Bus topology.
This topology divides the network into multiple levels or layers. Mainly in LANs, a network is bifurcated into three types of network devices. The lowermost is the access layer, where computers are attached. The middle layer is the distribution layer, which works as a mediator between the upper and lower layers. The highest layer is the core layer, the central point of the network, i.e. the root of the tree from which all nodes fork.
All neighboring hosts have point-to-point connections between them. Similar to Bus topology, if the root goes down the entire network suffers, even though it is not a single point of failure. Every connection serves as a point of failure, the failing of which divides the network into unreachable segments.
Internetworking Devices
1. Repeater – A repeater operates at the physical layer. Its job is to regenerate the signal over the
same network before the signal becomes too weak or corrupted so as to extend the length to
which the signal can be transmitted over the same network. An important point to be noted about
repeaters is that they do not amplify the signal. When the signal becomes weak, they copy the
signal bit by bit and regenerate it at the original strength. It is a 2 port device.
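The regenerate-rather-than-amplify behaviour described above can be sketched in a few lines of Python; the sample values and the 0.5 threshold are invented for illustration:

    def regenerate(samples, threshold=0.5):
        # A repeater re-decides each incoming sample as a 0 or 1 and re-emits
        # it at full strength, instead of amplifying the noisy waveform.
        return [1.0 if s >= threshold else 0.0 for s in samples]

    noisy_bits = [0.9, 0.2, 0.65, 0.1, 0.55]   # attenuated, noisy levels
    print(regenerate(noisy_bits))               # [1.0, 0.0, 1.0, 0.0, 1.0]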
2. Hub – A hub is basically a multiport repeater. A hub connects multiple wires coming from
different branches, for example, the connector in star topology which connects different stations.
Hubs cannot filter data, so data packets are sent to all connected devices. In other
words, collision domain of all hosts connected through Hub remains one. Also, they do not have
intelligence to find out best path for data packets which leads to inefficiencies and wastage.
Types of Hub
 Active Hub: These hubs have their own power supply and can clean, boost and relay the signal along the network. An active hub serves both as a repeater and as a wiring center, and is used to extend the maximum distance between nodes.
 Passive Hub: These hubs collect wiring from nodes and draw power from an active hub. They relay signals onto the network without cleaning or boosting them, and cannot be used to extend the distance between nodes.
3. Bridge – A bridge operates at data link layer. A bridge is a repeater, with add on functionality
of filtering content by reading the MAC addresses of source and destination. It is also used for
interconnecting two LANs working on the same protocol. It has a single input and single output
port, thus making it a 2 port device.
Types of Bridges
 Transparent Bridges: These are bridges in which the stations are completely unaware of the bridge's existence, i.e. whether a bridge is added to or deleted from the network, reconfiguration of the stations is unnecessary. These bridges make use of two processes: bridge forwarding and bridge learning.
 Source Routing Bridges: In these bridges, the routing operation is performed by the source station, and the frame specifies which route to follow. The host can discover the route by sending a special frame called a discovery frame, which spreads through the entire network using all possible paths to the destination.
4. Switch – A switch is a multiport bridge with a buffer and a design that can boost its efficiency (a large number of ports implies less traffic) and performance. A switch is a data link layer device. A switch can perform error checking before forwarding data, which makes it very efficient: it does not forward packets that have errors, and it forwards good packets selectively to the correct port only. In other words, a switch divides the collision domain of hosts, but the broadcast domain remains the same.
5. Routers – A router is a device like a switch that routes data packets based on their IP
addresses. Router is mainly a Network Layer device. Routers normally connect LANs and
WANs together and have a dynamically updating routing table based on which they make
decisions on routing the data packets. Routers divide the broadcast domains of hosts connected through them.
6. Gateway – A gateway, as the name suggests, is a passage to connect two networks together
that may work upon different networking models. They basically work as messenger agents that take data from one system, interpret it, and transfer it to another system. Gateways are also called protocol converters and can operate at any network layer. Gateways are generally more complex than switches or routers.
7. Brouter – Also known as a bridging router, this device combines the features of both a bridge and a router. It can work either at the data link layer or at the network layer. Working as a router, it is capable of routing packets across networks; working as a bridge, it is capable of filtering local area network traffic.
IoT (Internet of Things) refers to the concept of connecting various devices to the internet to make them communicable with each other over the internet. IoT effectively uses emerging and existing technology for networking and sensing. It enables everyday things, embedded with software, electronics and sensors, to connect to the internet and thereby gather and exchange data. The concept of Machine to Machine (M2M) communication is the backbone of IoT; it refers to machines connecting and communicating with each other without human interference. M2M involves collecting data and collaborating with the cloud to manage it. IoT integrates greater compute capability and makes use of data analytics to perform meaningful extraction of information.
Any device is considered an IoT device if it has a sensor attached to it and can transfer data from one device to another over the internet. IoT devices consist of software, wireless sensors, actuators and computing devices. They are connected to a particular device that operates with the help of the internet, enabling data to be transferred among devices or people.
The working of IoT differs across IoT architectures, but the main concept is similar. The working process of IoT starts with devices, like mobile phones, smart watches and electronic appliances, which communicate securely with an IoT platform; the platform collects and analyzes the data from all the devices and transfers the significant data to applications on devices.
Example: Consider a house where we connect our home utilities, such as the air conditioner, lights and refrigerator, with each other, and all these things are managed on a common platform. With such a platform we can also connect our car, track its fuel meter and speed level, and track the location of the car. If I like the room temperature to be set at 25 or 26 degrees Celsius when I reach home from my office, then based on my car's location my AC would start 10 minutes before I arrive home. This can be done through the Internet of Things (IoT).
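As a toy illustration of the rule just described, the Python sketch below turns the AC on once the car's estimated time to home drops to 10 minutes; all names, numbers and the simple distance/speed estimate are hypothetical:

    def minutes_to_home(distance_km: float, speed_kmph: float) -> float:
        # Crude travel-time estimate from the car's current position to home.
        return (distance_km / speed_kmph) * 60 if speed_kmph > 0 else float("inf")

    def ac_command(distance_km: float, speed_kmph: float, target_c: int = 25) -> str:
        # Switch the AC on once the car is about 10 minutes from home.
        if minutes_to_home(distance_km, speed_kmph) <= 10:
            return "AC ON, target %d degrees C" % target_c
        return "AC OFF"

    print(ac_command(distance_km=12.0, speed_kmph=40.0))  # ~18 min away -> AC OFF
    print(ac_command(distance_km=5.0, speed_kmph=40.0))   # ~7.5 min away -> AC ON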
1.1 Key Features of IoT: Fig 6.1 summarizes the key features of IoT; these features are explained below.
Connectivity: From the definition of IoT, it is clear that IoT works by connecting various devices so that they can communicate to achieve a specific purpose.
Analyzing: After connecting various devices, the collected data is analyzed so that it can be converted into smart output.
Sensing: Sensors are used to sense data, which helps IoT behave like an active network.
Active management: IoT provides active product, service and content management.
Devices: IoT uses small devices to deliver versatile and scalable performance.
Fig 6.1: Key features of IoT – Connectivity, Analyzing, Artificial Intelligence, Sensing, Active Management, Devices.
"Case Study on Smart Cities": this study was carried out by the Department for Business Innovation and Skills, London, in October 2013.
In the past decade, the evolution and rapid adoption of information technology, sensing, big data and information-based products and services has shifted the way in which people live in cities. Smart phones make anytime, anywhere access to information, services and communication a baseline expectation of many citizens, who have adapted almost seamlessly to this new way of living. Technology vendors argue that 'smart city technologies' of enhanced sensing, information management and control could improve the efficiency, quality and cost of providing
city services. At the same time, while city governments make this transition to online service
provision, they must ensure that those who do not have access to this technology are not left
behind. The public sector faces particular challenges when responding to the opportunities that
the ‘smart city’ and private sector innovators might bring. They struggle to quantify the impact
of novel, disruptive technologies, which can make investment challenging. The organisational
structure and culture of City Councils can block cross-departmental long-term strategic thinking
about ICT, and the required organisational changes can be difficult to implement. While all cities
are unique, they also have common objectives and face common challenges. A study of six cities (Chicago, Rio de Janeiro, Stockholm, Boston, Barcelona and Hong Kong) focused on how these
cities are addressing their challenges, and how they are adapting their organisations to deliver
new digital services to their citizens.
1.2.1 Findings
This study has highlighted common themes in cities adopting smart approaches to city
management.
 • Leadership models. A strong political mandate for action supported by a clear vision of the
role of smart in the city supports strategic alignment and investment in technology across
departments. This should be inclusive of grass-roots activities (such as individual department
pilots, or local SME innovation) to ensure the longevity and sustainability of the programme.
 • Mechanisms for managing risk and introducing innovation. Cities can manage the risk
associated with innovation through both organisational structure and funding models.
Organisationally, they can create a function whose role is to act entrepreneurially, collaborate,
and pilot new ideas. This function may be supported by capital that is not drawn from the tax
payer (e.g. through private grants from foundations etc.), allowing funds to be used more flexibly
for innovative projects where the outcomes are less certain.
• In order to support cross departmental working for smart cities, many cities are choosing to
place the smart city vision in a department that already works horizontally across city siloes.
Alternatively they are adding in new groups to their organisational structure that are able to act
as umbrellas for a host of existing activities. The aim of this is to ensure that all departments are
working together towards an aligned vision.
 • Procurement policy often makes working with SMEs challenging for local governments,
which can act against smart city aspirations. This can be combated by placing a threshold on the size of projects that need to go through formal procurement, or by supporting small companies
through the procurement process.
• Smart cities no longer place city governments as the top-down drivers of development in the
city, but instead they act as one player in an ecosystem. In response to this, smart city strategies
should represent the needs and capabilities of a variety of city stakeholders. In particular,
relationships with community groups, the private sector and universities are core to developing
well-rounded and sustainable initiatives.
• Data analytics can be leveraged to plan and deliver local services better.
• While smart city services and the move to e-government approaches offer significant advantages for citizens, special attention must be paid to ensure that the opportunities are equally
accessible by all. Providing vulnerable citizens with access to internet, devices and training
around the use of digital services as well as ensuring the transparency of and access to
government data is essential in ensuring that certain citizen groups are not marginalised by the
move to smart city approaches.
2. Cyber Security: We are living in a digital world, so our private information is more vulnerable than ever before. In the present era the world is networked together, from internet banking to government infrastructure, and computers and other devices are used to store data. A portion of that data can be sensitive information, for which unauthorized access or exposure could have negative consequences.
As cyber-attacks are increasing, organizations which deal with information related to health, financial records or national security need to take steps to safeguard their sensitive business and personal information.
Cyber attacks can be classified into two categories: web based attacks and system based attacks. Web based attacks occur on a website or web application, e.g. phishing and denial of service. System based attacks are intended to compromise a computer or a computer network, e.g. viruses and worms.
We can also define cybersecurity as the set of principles and practices designed to protect our computing resources and online information against cyber attacks. Due to the heavy dependency on computers in modern industry, which store and transmit an abundance of confidential and essential information about people, cybersecurity is a vital function and a required insurance for many businesses.
2.1 Goals of Cyber Security: The main aim of cybersecurity is to protect information from being compromised, attacked or stolen. Cybersecurity can be measured against three goals:
The Confidentiality, Integrity, Availability (CIA) triad is the basis of all security programs. The CIA triad is a security model designed to direct policies for information security within an organization or company. These three elements of the triad are considered the most crucial components of security.
Most organizations and companies refer to the CIA criteria when they install a new application, create a database, or guarantee access to some data. For data to be completely secure, all of these security goals must be met.
1. Confidentiality
Confidentiality means that information is accessible only to authorized users and is protected from disclosure to unauthorized parties.
2. Integrity
Integrity means the methods for ensuring that data is real, accurate and safeguarded from unauthorized modification. It is the property that information has not been changed in an unauthorized way and that the source of the information is genuine.
3. Availability
Availability means that information and computing resources are accessible to authorized users whenever they are needed.
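As a concrete illustration of the integrity goal above, a common practice is to compare cryptographic hashes of the data; this minimal Python sketch uses the standard hashlib module, with invented message contents:

    import hashlib

    def digest(data: bytes) -> str:
        # Any unauthorized change to the data yields a completely different hash.
        return hashlib.sha256(data).hexdigest()

    original = b"transfer 100 to account 42"
    tampered = b"transfer 900 to account 42"   # modified in transit

    print(digest(original) == digest(original))  # True: data unchanged
    print(digest(original) == digest(tampered))  # False: integrity violated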
1. GDPR: The General Data Protection Regulation governs the handling of EU citizens' personal data. Hackers use non-compliance with GDPR to their advantage by blackmailing companies that don't meet these regulations.
2. Attacks via Compromised IoT devices: Attacks by botnets, DDoS and ransomware increased through 2018.
3. Cloud security issues: There are many cloud security issues which cannot be ignored by cloud providers, especially as the cloud is increasingly used by IoT; providers are not completely aware of the complexities involved in securing cloud data.
4. Attacks based on Machine Learning and AI: There is a chance that hackers will use innovative AI/ML-driven solutions to perform sophisticated attacks, in spite of AI/ML security tools.
5. Fileless Malware: Non-malware attacks are also a serious problem, and most organizations are unable to deal with this type of attack. These attacks exploit Windows vulnerabilities and execute their payload in memory, which is harder to detect.
                                            Unit-III
Efficient computing services over the network were first provided with the help of mainframe time-sharing technology in the 1960s, which realized the concept of a public utility through centralized computing connecting multiple users over the network. However, mainframes were tough to provision or scale up-front due to high infrastructure costs, even though they delivered acceptable performance with efficient utilization of computing resources. The problem with this technology was that users didn't get full control over the performance of their loaded applications, because at any one moment a number of users shared the mainframe. To solve this problem, personal computers were introduced with the idea of full control over computing resources, even if those resources were not effectively utilized. Personal computers became affordable with the massive change in the semiconductor industry, and this change eliminated the need for mainframes in business. From this, the new challenge of data sharing arose. To address it, client-server systems were used, supported by centralized data management. Increasing computing demands at the business level then led to the adoption of the Internet, transforming simple client-server systems into complex distributed ones. As a consequence, the management cost and complexity of Information Technology (IT) infrastructure have risen sharply. For many IT industries, the long-standing dream has been to adopt a feasible computing model, named Cloud Computing, that may reduce management cost and complexity while achieving high operational efficiency. Long before the cloud computing model was introduced, software suppliers provided services to customers via the internet under a model called Application Service Provision (ASP). ASP was the first service-delivery platform formed around the combination of communication and computing, in the mid 1990s. However, complicated initial installation and configuration at the customer end, and the absence of multitenancy, caused the ASP model to fail. Consequently, ASP lacked the benefits of flexibility and scalability that cloud computing enjoys.
Cloud computing has received significant attention in the Information and Communication Technology industry thanks to its on-demand provisioning of computing resources to consumers using a pay-as-utility model. There are numerous definitions of cloud computing originating from multiple sources. The term "cloud computing" refers to a form of efficient and flexible usage of application services that are delivered just in time, on demand, over the internet and paid per usage. The National Institute of Standards and Technology (NIST) has described the standard definition of cloud computing: a model that enables convenient, on-demand network access to a shared pool of configurable computing resources, like servers, applications, storage, network interfaces and services, that can be rapidly allocated and de-allocated with negligible service provider interaction and management effort. Cloud computing is characterized by five attributes: scalability, multitenancy, rapid elasticity, pay-as-you-go and self-provisioning of resources. Furthermore, the cloud also provides a convenient way for users to access stored data from anywhere, at any time, on any device with an internet connection. Clouds are formed according to deployment models.
 Organizations or consumers can use software online on a usage basis without buying the software, with all updates and maintenance handled by the cloud service provider; this is referred to as Software as a Service (SaaS), e.g. Google Apps. If a platform is given to consumers for developing, designing, implementing, debugging and deploying their applications on the cloud, it is called Platform as a Service (PaaS), e.g. Salesforce's Force.com platform. Infrastructure as a Service (IaaS) provides only infrastructural resources like storage space, network bandwidth and servers, e.g. Amazon Web Services.
The best part of cloud computing is that it provides more flexibility than its previous counterparts, and it has shown many benefits to the enterprise IT world. Cost optimization is the frontrunner among them, since the principle of the cloud is "pay as per use". Other benefits are increased mobility, ease of use, apt utilization of resources, portability of applications, etc. This means users are able to access information from anywhere at any time, without leaving the underlying hardware resources idle or unused. Due to these benefits, today's computing world has witnessed a vast migration of organizations from their traditional IT infrastructure to the cloud.
• Cost Savings
• Remote Working
• Efficiency
• Flexibility
• Future Proofing
• Morale Boosting
There are certain services and models working behind the scenes that make cloud computing feasible and accessible to end users. The following are the working models for cloud computing:
    1. Deployment Models
    2. Service Models
Deployment Models
Cloud computing has emerged mainly from the appearance of public computing utilities. In this sense, regardless of its service class, a cloud can be classified as public, private, community or hybrid, based on its deployment model, as shown in the figure.
1. PUBLIC CLOUD
The public cloud allows systems and services to be easily accessible to the general public. A public cloud may be less secure because of its openness.
2. PRIVATE CLOUD
The private cloud allows systems and services to be accessible only within an organization. It is more secure because of its private nature.
3. COMMUNITY CLOUD
The community cloud allows systems and services to be accessible to a group of organizations.
4. HYBRID CLOUD
The hybrid cloud is a mixture of public and private clouds, in which critical activities are performed using the private cloud while non-critical activities are performed using the public cloud.
Cloud computing services are divided into three classes, according to the abstraction level of the
capability provided and the service model of providers, namely:
1. Infrastructure as a Service
2. Platform as a Service
In addition to infrastructure-oriented clouds that provide raw computing and storage services, another approach is to offer a higher level of abstraction that makes a cloud easily programmable, known as Platform as a Service (PaaS). Google App Engine, an example of Platform as a Service, offers a scalable environment for developing and hosting Web applications, which must be written in specific programming languages such as Python or Java and use the service's own proprietary structured object data store.
3. Software as a Service
Applications reside at the top of the cloud stack. Services provided by this layer can be accessed by end users through Web portals, and consumers are increasingly shifting from locally installed computer programs to online software services that offer the same functionality. Traditional desktop applications such as word processors and spreadsheets can now be accessed as services on the Web.
Fig.1 Cloud Computing Stack
The enterprise world has adopted cloud computing due to a number of important factors such as scalability, elasticity, the pay-as-you-go model, resource pooling and virtualization. Of these, a few provide the key incentives for an organization to use cloud computing:
   i)     Elasticity: the ability to scale computing capacity up or down on user demand.
   ii)    Pay-as-you-go model: the cost of resource usage is metered like a utility. The public cloud allows companies to avoid up-front investment in infrastructure (e.g. Amazon). With this model companies need not purchase computing resources outright, which makes it feasible for smaller organizations or start-ups that cannot afford large costs at the beginning of their business journey.
   iii)   Self-service: resources are provisioned dynamically to consumers on demand, without personal interaction with the Cloud Service Provider (CSP).
   iv)    Extensive network access: the cloud provides platform-independent access to resources over the network from anywhere in the world, on laptops or on phones (thin/thick clients).
Future Trends in Cloud Computing
Cloud computing is changing businesses in many ways. Whether it is the way they store their data or how they protect sensitive information, cloud computing benefits businesses in every sector.
Smart businesses are always looking for the most innovative ways to improve and accomplish their business objectives. When it comes to cloud technology, more and more businesses are realizing the benefits this technology can provide them and are beginning to seek more cloud computing options for conducting their business activities. With the multitude of future technology trends in cloud computing, companies in every sector stand to benefit from the opportunities cloud technology offers.
The growth of cloud computing began when Infrastructure as a Service (IaaS) and Platform as a Service (PaaS) expanded the number of cloud solutions available in the public and private sectors. As IaaS and PaaS continue to be used worldwide to achieve diverse goals, these solutions will remain the most widely deployed cloud services around the world. Cisco predicts that Software as a Service (SaaS) solutions will account for more than 60% of all cloud-based workloads this year, and that PaaS and IaaS deployments will continue to increase throughout 2018. Any business looking to simplify its operations and make services easier for customers to access will most likely move toward cloud service solutions.
A huge aspect affecting the future of cloud computing is the amount of storage the cloud will offer companies and individuals. This growth is driven by the many businesses adopting cloud technology as a core part of doing business. Providers are predicted to bring more data centers online, with larger-capacity storage equipment, throughout this year; Cisco estimates the storage capacity of the cloud will double this year alone. With this increased storage, more businesses will be able to store large data sets and perform analytics on them in the cloud, gaining valuable insights into customer behavior, human systems and strategic financial investments, to name just a few areas.
Most of us have heard the buzzword Internet of Things (IoT). With continuous innovations in real-time data analytics and cloud computing, the newer buzzword Internet of Everything (IoE) will be used more often as 2018 progresses. Cloud computing will play a major role in the way IoE develops, since IoE relies heavily on machine-to-machine communication, data, processes and the way humans interact with things in their environment. A major trend this year will be the significant role cloud computing plays in IoE's ability to simplify all of these interactions.
The quality of the internet has been improving every year since its creation, and 2018 is expected to be no different as the amount of data generated and stored around the world increases. Customers today already expect high-quality, fast-loading services and apps, and this expectation will drive improvements in network quality and cloud computing. It will also lead businesses to upgrade their platforms and services to be more responsive to the needs of their customers. As the quality of the internet improves, the IoT and IoE industries will benefit a great deal from faster network speeds and the ability to receive and deliver data more efficiently in real time.
One of the most important cloud computing trends of 2018 will be the increased solutions the cloud brings to security. 2017 saw the most cyber-attacks ever recorded in the history of the internet, and 2018 should be no different. Many experts predict 2018 will see more individual and state-sponsored attacks aimed at undermining cloud infrastructure security. Cyber-attacks are also becoming more sophisticated, which means anyone in charge of a company's security will need to become equally sophisticated in the way they detect and prevent these attacks. Cloud services will be able to help companies with their security measures by offering managed security services.
Big Data is a term used to describe data sets that are huge in size and yet growing exponentially with time. Due to their large size and complexity, they cannot be captured or processed efficiently by standard software tools or databases. The data sizes range from dozens of terabytes (10^12 bytes) to many petabytes (10^15 bytes) in a single data set. Big Data can also refer to data that is unstructured and cannot be put into the standard formats and indexes of a conventional database, e.g. image files, Twitter comments, blogs, consumer reviews and feedback on products or services, etc.
A. Structured
Structured data is data that can be stored, accessed and processed in a well-defined format. This type of data exists in fixed fields inside a record or file. Over time, computer science has made prominent progress in developing techniques for working with such data and deriving value out of it. Nowadays, however, we are facing issues as the size of such data grows to a huge extent, with typical sizes in the range of multiple zettabytes (10^21 bytes). Data stored in a relational database is the best example of structured data, for instance an 'Employee' table in a database.
 Gaming-related data, for example: each and every move a user makes in a game can be recorded and used to understand the behaviour of end users.
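As a small illustration of structured data, the following sketch creates and queries an 'Employee' table with Python's built-in sqlite3 module; the column names and rows are invented for the example.

    import sqlite3

    # An in-memory relational database: every row fits a fixed, well-defined schema.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE Employee (id INTEGER PRIMARY KEY, name TEXT, dept TEXT, salary REAL)")
    conn.executemany(
        "INSERT INTO Employee (id, name, dept, salary) VALUES (?, ?, ?, ?)",
        [(1, "Asha", "Finance", 52000.0), (2, "Ravi", "IT", 61000.0)],  # sample rows
    )

    # Structured data can be queried directly because its format is known in advance.
    for row in conn.execute("SELECT name, salary FROM Employee WHERE dept = 'IT'"):
        print(row)  # ('Ravi', 61000.0)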
B. Unstructured
Unstructured data refers to data that does not have any well-defined format; its form or structure is unknown. In addition to its immense size, unstructured data poses multiple challenges when it comes to processing it to derive value. Organizations today have a wealth of data available to them, but unfortunately it is hard to derive value from it because the data is in raw, unstructured form. This type of data usually does not live in a traditional database. The output returned by a Google search, or e-mail messages containing text and multimedia, are examples of unstructured data.
 Mobile data, for example: this includes the data generated by mobile devices, such as text and multimedia messages.
C. Semi-structured
Semi-structured data can contain both forms of data. It looks structured in form, but it is not defined by a fixed schema such as a table definition in a relational DBMS. A typical example of semi-structured data is data represented in an XML file, for instance personal data stored as XML.
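As a hedged sketch of what such semi-structured data looks like in practice, the snippet below parses a small invented XML document with Python's standard-library parser; note that one record carries a field the other omits, which a rigid table schema would not allow.

    import xml.etree.ElementTree as ET

    # Semi-structured: tags mark up the data, but there is no rigid schema --
    # one record may carry fields that another omits.
    xml_doc = """
    <people>
      <person><name>Meera</name><city>Pune</city></person>
      <person><name>John</name><email>john@example.com</email></person>
    </people>
    """

    for person in ET.fromstring(xml_doc):
        name = person.findtext("name")
        # Fields may or may not be present, so we fall back to a default.
        city = person.findtext("city", default="unknown")
        print(name, city)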
Big Data is commonly characterized by the following five Vs:
1. Volume
2. Velocity
3. Variety
4. Veracity
5. Value
1. Volume:
The name Big Data itself refers to an enormous size. The size of the data plays a crucial role in determining the value that can be extracted from it. In today's digitalized world of computer networks and mobile devices, we deal with very large volumes of data, on the order of terabytes (10^12 bytes) or petabytes (10^15 bytes). Volume, then, is the size of the data we deal with day to day; it captures the massive amount of data in data stores and the concerns related to its scalability, accessibility and manageability.
2. Velocity:
The term 'velocity' refers to the speed at which data is generated. How fast the data is generated and processed to meet demand determines the real potential in the data.
The data being generated on stock exchanges (share values, etc.), Facebook (likes, shares, etc.), Twitter (tweets), business processes, application logs, networks, etc. arrives at high speed. The flow of data is massive and continuous, and organizations need all of this data, generated and updated at high speed, for analytical purposes.
3. Variety:
Variety refers to the different types of data being generated from various sources. It covers the many classes of data and how they relate to one another. In earlier days, spreadsheets and databases were the only data sources considered by most applications; nowadays, data also arrives in the form of emails, photos, videos, monitoring devices, PDFs, audio, etc. This variety of data poses challenges for the storage, mining and analysis of data.
4. Veracity:
The term ‘veracity’ means the quality or trustworthiness of the data: just how accurate is it? It refers to the inconsistencies and uncertainty that an information set can exhibit, which hamper the ability to maintain and manage the data effectively. Information can only be interpreted meaningfully when the circumstances under which it was collected are taken into account. Varying data quality undermines precise analysis: the same technologies and tools may be used to collect and analyze the data, but it falls to the analyst to judge its veracity before relying on the results.
5. Value:
This refers to the worth of the data being extracted. Having endless amounts of data is one thing, but unless it can be turned into value it is useless. While there is a clear link between data and insights, this does not always mean there is value in Big Data. The most important part of embarking on a big data initiative is understanding the costs and benefits of collecting and analyzing the data, to ensure that the data reaped can ultimately be monetized. Hence it can be said that Value is the most important of all the five Vs.
Big Data has become an integral part of businesses today. It contains huge volumes of diverse data being created at high speed. To manage such data, organizations need a new set of tools that can manage it well and bring it into a form helpful for real-time analysis and decision making. Here is a list of widely used Big Data analytics tools.
1. Hadoop
Big Data is incomplete without Hadoop, as expert data scientists know. Hadoop is an open-source framework, originally created by Doug Cutting (later a Yahoo engineer) and now managed by the Apache Software Foundation. Hadoop can store and process large amounts of varied data in a distributed manner, across clusters of commodity computers and hardware, using a simple programming model. With its processing power and capability to handle innumerable tasks, Hadoop is designed so that hardware failure is never a worry. Key points about Hadoop:
          Hadoop acts as a warehouse for storing raw data and helps in refining it.
          Hadoop is a powerful, cheap and flexible place for archiving data.
2. MongoDB
MongoDB is a contemporary alternative to traditional relational databases. It is best suited for data sets that vary or change frequently, or that are semi-structured or unstructured. Typical uses of MongoDB include storing data from mobile apps, content management systems, product catalogs and more.
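A brief sketch, under assumptions, of how such a document store is used from Python with the pymongo driver; the database name, collection and fields are invented, and a MongoDB server is assumed to be running locally.

    from pymongo import MongoClient

    # Connect to a locally running MongoDB server (assumed for this example).
    client = MongoClient("mongodb://localhost:27017")
    catalog = client["shop"]["products"]  # database "shop", collection "products"

    # Documents in one collection need not share a schema -- well suited
    # to semi-structured or frequently changing data.
    catalog.insert_one({"name": "MP3 player", "price": 49.99, "tags": ["audio"]})
    catalog.insert_one({"name": "Drone", "price": 399.0, "camera": True})

    print(catalog.find_one({"name": "Drone"}))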
3. Cassandra
Used by industry players like Cisco, Netflix, Twitter and more, Cassandra was first developed by the social media giant Facebook as a NoSQL solution. It is a high-performing distributed database deployed to handle massive chunks of data on commodity servers. Cassandra has no single point of failure and is one of the most reliable Big Data tools.
4. Drill
An open-source framework that allows experts to run interactive analyses on large-scale datasets. Developed under Apache, Drill was designed to scale to 10,000+ servers and to process petabytes of data and millions of records in seconds. It supports numerous file systems and databases, such as MongoDB, HDFS, Amazon S3, Google Cloud Storage and more.
5. Elasticsearch
This open-source enterprise search engine is written in Java and released under the Apache license. One of its best-known strengths is powering data-discovery applications with its extremely fast search capabilities.
6. HCatalog
HCatalog allows users to view data stored across all Hadoop clusters and lets them use tools like Hive and Pig for data processing without having to know where the datasets are physically located. A metadata management tool, HCatalog also functions as a sharing service for Apache Hadoop.
7. Oozie
One of the best workflow-processing systems, Oozie allows you to define a diverse range of jobs written or programmed in multiple languages. The tool also links jobs to one another and conveniently allows users to specify dependencies between them.
8. Storm
Last but definitely not least, Storm supports real-time processing of unstructured data sets. It is reliable, fault-tolerant and compatible with any programming language. Open-sourced by Twitter and now part of the Apache family of tools, Storm is a real-time distributed computing framework.
9. MapReduce
It is a software framework that serves as the processing layer of Hadoop. A MapReduce job has two important parts:
          •The "Map" function, which processes the data at the node level by dividing a query into several parts.
          •The "Reduce" function, which aggregates the Map outputs into the result of the query.
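To show the shape of the model, here is a minimal word-count sketch in plain Python that imitates the two phases in a single process; a real MapReduce job would run these phases distributed across a Hadoop cluster.

    from itertools import groupby
    from operator import itemgetter

    def map_phase(line):
        # Map: emit a (key, value) pair for every word in the input split.
        return [(word, 1) for word in line.split()]

    def reduce_phase(pairs):
        # Shuffle/sort: group pairs by key, as the framework would between phases.
        pairs = sorted(pairs, key=itemgetter(0))
        # Reduce: sum the counts for each distinct word.
        return {word: sum(c for _, c in group)
                for word, group in groupby(pairs, key=itemgetter(0))}

    lines = ["big data needs big tools", "big ideas need data"]
    mapped = [pair for line in lines for pair in map_phase(line)]
    print(reduce_phase(mapped))  # e.g. {'big': 3, 'data': 2, ...}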
10. BigTop
Apache Bigtop is a Big Data tool focused on the packaging, deployment and interoperability testing of Hadoop's sub-projects and related modules, with the aim of improving the Hadoop ecosystem as a whole.
11. Sqoop
Sqoop is a connectivity tool used for moving data from non-Hadoop data stores, such as relational databases and data warehouses, into Hadoop. It lets users specify the target location inside Hadoop and instructs Sqoop to move data from Oracle, Teradata or other relational databases to that destination.
12. Hive
Hive is a Hadoop-based data-warehousing framework originally developed by Facebook. It allows users to write queries in a SQL-like language called HiveQL, which are then converted into MapReduce jobs. This lets SQL programmers with no previous MapReduce experience use the warehouse, and makes it easy to integrate with business-intelligence and visualization tools such as MicroStrategy, Tableau and Revolution Analytics.
13. Pig
Pig Latin is a Hadoop-based language developed by Yahoo. It is relatively easy to read and is well suited to very long data pipelines (a limitation of SQL). Apache Pig is an abstraction over MapReduce: a platform that helps analyze huge volumes of data by describing them as data flows. Pig is widely used with Hadoop, and all the usual data-manipulation operations in Hadoop can be accomplished using Pig.
14. HBase
HBase is a non-relational database that permits low-latency, quick lookups in Hadoop. It adds transactional capabilities to Hadoop, allowing users to perform updates, inserts and deletes. eBay and Facebook use HBase widely. HBase is a NoSQL database that functions on top of Hadoop as a distributed, scalable big data store; this means that HBase can leverage the Hadoop Distributed File System (HDFS) and benefit from Hadoop's MapReduce programming model.
Big Data has totally changed and revolutionized the way businesses and organizations work. Organizations are leveraging the benefits provided by big data applications, and big data has created many opportunities across various industries. Some of the major applications of Big Data are listed below:
Media and Entertainment - Various companies in the media and entertainment industry are facing new business models for the way they create, market and distribute their content. This is happening because today's consumers search for content and require access to it anywhere, any time, on any device.
Big Data provides actionable insights about millions of individuals: publishing environments now tailor advertisements and content to appeal to consumers, with the insights gathered through various data-mining activities. Big Data applications thus benefit the media and entertainment industry in several ways.
Health care – Combined data from clinical research, pharmaceutical R&D, claims and cost, and patient behaviour and sentiment can help identify the progress and outcomes of treatment at a faster pace, detect fraud, check the accuracy and consistency of claims, identify additional indications and discover adverse effects, analyze disease patterns, improve R&D productivity and develop personalized medicine.
Researchers mine the data to see which treatments are more effective for particular conditions, identify patterns related to drug side effects, and gain other important information that can help patients and reduce costs.
Energy and Utilities – The energy and utility industry is undergoing a large-scale transformation using predictive analysis. Grids are getting smarter with its help, electric power sources are getting cleaner and customers have more choices for receiving power. One of the technological drivers that has made an impact is the emergence of Big Data and analytics, which play a pivotal role in the industry. Big Data needs to be fed into analytical algorithms to:
   Monitor vast volumes of real-time data to find the generation and consumption patterns.
   Predict shortage in energy supply and future demand by studying those patterns.
   Help optimize the energy supply and demand cycle.
   Estimate if there is surplus energy being produced in some area and optimally redistribute
    the same to an area of energy shortage.
Manufacturing – This sector, which already uses information technology and data intensively to drive quality and efficiency through automated design, building and distribution of products, can also be a major beneficiary of big data. Information captured through customer feedback via various channels (direct feedback, social forums like Twitter and Facebook, product reviews from specialists, etc.) can help manufacturing organizations quickly cater to market and consumer requirements by developing products as per demand.
Government Sector - Governments come face to face with huge amounts of data on an almost daily basis, keeping track of various records and databases regarding their citizens, their growth, energy resources, geographical surveys and much more. All this data contributes to big data, and its proper study and analysis helps governments in endless ways. A few of them are as follows:
        Welfare Schemes
        Cyber Security
      Big Data is heavily used for fraud detection.
      It is also used in catching tax evaders.
Education - The education industry is flooded with huge amounts of data related to students, faculty, courses, results and more. We have now realized that proper study and analysis of this data can provide insights that can be used to improve the operational effectiveness and working of educational institutes.
Following are some of the fields in the education industry that have been transformed by big-data-motivated changes:
       Customized programs and schemes to benefit individual students can be created using the data collected on the basis of each student's learning history. This improves the overall student results.
       Reframing the course material according to the data collected on what a student learns, and to what extent, through real-time monitoring of the components of a course is beneficial for the students.
 Grading Systems
 Career Prediction
       Appropriate analysis and study of every student's records helps in understanding each student's progress, strengths, weaknesses and interests. It also helps in determining which career would be the most suitable for the student in future.
       The applications of big data have provided a solution to one of the biggest pitfalls of the education system, namely the one-size-fits-all fashion of the academic set-up, by contributing to e-learning solutions.
Transportation – With the various GPS-based applications used by mobile users, security-monitoring devices across major cities, and other navigation systems covering traffic monitoring, accidents, scheduled roadwork and congested areas, a huge amount of data is captured about people and their movements.
Insurance – The use of Big Data has huge potential to improve the consumer offerings of major insurers, which in turn increases competitive advantage and makes these companies more successful.
Insurance organizations are developing insurance products that use large amounts of data to assess, select, price, predict and prevent risks that in some cases were previously considered uninsurable. Going forward, access to data and the ability to derive new risk-related insights from it will be key factors for competitiveness in the insurance industry.
Big data analytics helps insurance companies take the correct decisions by providing them with intuitive insights.
Banking And Securities – The amount of data in the banking sector is skyrocketing every second. Proper study and analysis of this data can help detect illegal activities. For example, various anti-money-laundering software packages use data analytics to detect suspicious transactions and to analyze customer data.
Personal data - Geo-targeted advertising, toll collection, people and vehicle tracking, etc. can all use this data. The behaviour patterns of people in certain areas can feed predictive analysis to identify social disruption or unrest. Mobile applications like Foursquare, Loot, Places, etc. can reveal the locations of the people using them, which can help people keep track of family members, friends and colleagues, bringing a more specific level of social interaction.
Public sector administration - The huge amounts of data gathered by government agencies on tax, benefits, public services, etc. can be used to improve operational efficiency, generate savings, reduce fraudulent benefit claims, increase tax collections, understand public sentiment and improve public services.
Retail – This is another major sector where big data initiatives can bring a lot of value for organizations, helping them in marketing, merchandising, operations, supply chain and pricing. Understanding the buying patterns and sentiments (comments, reviews, feedback) of consumers at the product, store, geographic and demographic levels can give valuable insights and help retail companies plan targeted marketing campaigns. In-store consumer behaviour and demand can inform appropriate merchandising, and analyzing peak periods and footfall can help in managing operations and the supply chain effectively.
Most organizations already carry out these activities, but big data initiatives give them the benefit of understanding and analyzing these trends at a much larger and wider, global scale of operations.
                              Chapter: Machine Learning
Introduction
Machine learning is a subfield of artificial intelligence. Its goal is to understand the structure of data and fit that data into models that can be understood and utilised by people. Although machine learning is a part of computer science, it differs from traditional computational approaches.
If you are asked to write the logic for adding two numbers, you can write it. But what if you are then asked to do multiplication with the same code? It won't be possible, because the program was written to do addition only. That is where machine learning comes into the picture: it allows the system to produce the desired results without hard-coded programs.
Adding two numbers is a very small task, but a computer can handle even complex problems in very little time. Likewise, the world is full of data: pictures, music, videos, text, spreadsheets and much more. Machine learning derives meaning out of all this data. It is all about learning from the data, building the logic and predicting the output, improving at a given task based on past experience.
It is used worldwide in many applications, such as:
      Image detection
      Recommendation
      Skin cancer detection
      Transportation
      Text and Speech
Any technology user today has benefitted from machine learning. Facial recognition technology allows social media platforms to tag people and share photos of friends. Recently, Apple's iOS 10 introduced a feature using this technology that differentiates pictures in the gallery: it recognises faces and objects by itself and sorts them into different folders for different people and objects.
Machine learning is a subset of artificial intelligence that enables systems to automatically learn and improve from experience. For example, recommendation engines powered by machine learning suggest what movies or shows to watch next according to the user's preferences.
Here's a definition by Tom Mitchell (a famous scientist):
“A computer program is said to learn from experience ‘E’ with respect to some class of tasks ‘T’ and performance measure ‘P’ if its performance at tasks in ‘T’, as measured by ‘P’, improves with experience ‘E’.”
This simply means that if a program improves at a certain task based on past experience, it has learnt.
    1. Supervised Learning:
In this type of learning the outcome or output is already known. We train the machine using data that is well labelled; labelled data is data tagged with the correct answer or output.
The data the machine takes is called training data, which includes both inputs and labels (targets). To understand inputs and targets, take an example: if we add 5 and 6 we get 11 as the result. Here a = 5 and b = 6 are the inputs and 11 is the label, or target.
First we train the model with lots of training data (inputs and targets); then, given new data, the trained logic predicts the output. This process, called supervised training, is fast and accurate.
There are two types of supervised learning:
     Regression
     Classification
   a. Regression:
      The type of problem where we need to predict a continuous response value. For example:
          What is the value of a stock?
          What is the price of a house in the city?
          How many runs will be on the board in a cricket game?
   b. Classification:
      The type of problem where we predict a categorical response value, where the data can be separated into classes.
      Yes/no questions are binary classification. For example:
          Is this mail spam or not?
          Will it rain today?
          Is this a picture of a cat or a dog?
      Multi-class classification questions can be:
          Is this mail spam or a promotion?
          Is this a picture of a cat, a dog or a tiger?
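As a minimal sketch of supervised classification with scikit-learn, the toy data below (two numeric features standing in for, say, message length and link count) is invented purely to show the train-then-predict flow.

    from sklearn.linear_model import LogisticRegression

    # Toy labelled training data: each row is [feature1, feature2],
    # and the label 1 means "spam", 0 means "not spam".
    X_train = [[120, 5], [40, 0], [300, 9], [25, 1], [200, 7], [60, 0]]
    y_train = [1, 0, 1, 0, 1, 0]

    # Fit (train) the model on inputs and their known labels.
    model = LogisticRegression().fit(X_train, y_train)

    # Predict the class of new, unseen inputs.
    print(model.predict([[250, 8], [30, 0]]))  # e.g. [1 0]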
2. Unsupervised Learning:
   Unsupervised learning is when we are dealing with data that has been neither labelled nor categorized. The goal is to find patterns and create structure in the data in order to derive meaning.
   Unlabelled data is used here, mainly for clustering (grouping) problems, where relationships among the given data need to be found. It is mainly used in descriptive modelling.
   For example: watching a movie in a language you don't understand, with no subtitles and no dictionary. Watching only one movie won't be beneficial, but if you watch hundreds or thousands of movies, your brain will start to form a model of how the language works: it will start recognising patterns, and you will start to expect certain sounds and words.
   In this type of learning, input data is given and the model runs on it; the output or outcome of the given data is unknown.
        Types of unsupervised learning:
             Clustering
             Anomaly detection
     Clustering: a type of problem where we group similar items together. It is similar to multi-class classification, but here we don't provide the labels; the system understands the data itself and clusters the data.
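A short sketch of clustering with scikit-learn's KMeans, on invented 2-D points; the algorithm receives no labels, only the number of groups to look for.

    from sklearn.cluster import KMeans

    # Unlabelled 2-D points: two loose groups, but we never say which is which.
    X = [[1.0, 1.1], [0.9, 1.3], [1.2, 0.8],
         [8.0, 8.2], [7.8, 8.5], [8.3, 7.9]]

    # Ask for two clusters; the algorithm discovers the grouping itself.
    km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)

    print(km.labels_)           # e.g. [0 0 0 1 1 1] -- discovered group ids
    print(km.cluster_centers_)  # the centre point of each discovered group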
   3. Reinforcement Learning:
      The ability of an agent to interact with the environment and find out the best outcome for a specific problem. Besides being a part of machine learning, it is a branch of artificial intelligence that follows the concept of trial and error. To produce agents, it goes through the following steps:
           The agent observes the input state.
           A decision-making function is used to choose an action.
           The agent receives a reward from the environment after the action is performed.
           The information about the reward is stored.
       For example: training robots for bin packing.
1.4.1   Gathering Data
        In this step we collect the data our machine will work on. The more the data, the better the potential of the system, and the closer its predictions will be to accurate. This is the first and most important step in machine learning, as the quality and quantity of data determine how good our prediction model will be. But wait, what is a model? A model is nothing but the prediction system we build so that it can take decisions on its own; it is built through a process called training. Here, training will be done on the basis of whether a drink is wine or beer. For the data we can use two aspects, i.e. colour and alcohol content.
Figure 1.1
1.4.2   Preparing Data
        The data we collected is raw data; it needs to be prepared for our task of building the model. We place the data in suitable places or columns and use it for machine learning. We have to be sure the data is complete and has no missing values. We also randomise the data while doing the analysis, to check whether the data is biased or not, as biased data can lead to biased output; randomisation also ensures the ordering does not affect the output. The data we collect is not solely for training: most of it is used for training, but some is kept aside for evaluation, since we cannot use the same data for evaluation. It is similar to the questions you get for homework versus those you get in an exam. The reason is that we do not want our system to memorise the data; we want it to learn from it, and if it can work on randomised, unseen data then our model comes a step closer to being accurate.
1.4.3     Choosing Model
        There are many models in this field: models for image processing, text recognition for numerals and alphabets, sound-processing models, etc. We must be selective and understand which type of model is suitable for our purposes. As we only have to work with two variables, i.e. colour and alcohol content, the data can be pictured as a graph with two axes x and y, where y could be colour and x could be alcohol content. Thus a linear model, which is quite simple to understand, will do.
1.4.4     Training
        Now the real process of machine learning starts, where we begin building our model. It is quite similar to how we learn in a classroom: in the beginning we know nothing about the topic, but as we progress we learn how to solve the questions. So at this stage we start training our model to solve problems, by giving it data and instructions.
        As discussed in the previous step, for the example of distinguishing wine from beer we are working with a linear model, so it has the simple formula of a straight line, y = mx + b, where m is the slope, b is the y-intercept, x is the input and y is the output.
        We can have any number of lines, with any number of slopes and intercepts for those lines. We arrange the slopes (m) in a matrix called the weights, and the intercepts (b) in a matrix called the biases. Our training process starts from random values in these matrices.
        While training, there is a chance the output will not be as desired, and that is why we have to update the values in the matrices accordingly.
       Figure 1.2
      Figure 1.3
        After updating the values, we repeat the process to check the output again. The process is repeated with random orders and random values, and each repetition is considered one 'step' of training. In the beginning we have a random straight line between x and y, but with repetition the line shifts, until in the end it sits adequately on the graph, separating the two classes in the colour/alcohol-content plane.
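The following sketch shows such a training loop on invented data using numpy: a single weight m and bias b are nudged by gradient descent, each iteration corresponding to one 'step' of training as described above.

    import numpy as np

    # Invented toy data roughly following y = 2x + 1, with a little noise.
    rng = np.random.default_rng(0)
    x = rng.uniform(0, 10, size=50)
    y = 2.0 * x + 1.0 + rng.normal(0, 0.5, size=50)

    m, b = rng.normal(), rng.normal()  # start from a random weight and bias
    lr = 0.01                          # learning rate (a hyper-parameter)

    for step in range(2000):
        pred = m * x + b               # the current line's predictions
        error = pred - y
        # Gradients of the mean squared error with respect to m and b.
        m -= lr * 2 * np.mean(error * x)
        b -= lr * 2 * np.mean(error)

    print(f"learned m={m:.2f}, b={b:.2f}")  # should approach m=2, b=1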
1.4.5     Evaluation
        After proper training, our model is ready for evaluation. Now the data we kept aside during the second step (preparing data) comes in handy. The process of evaluation is similar to the training process, but with an important difference: the data used for training made the model fit, whereas the data used in evaluation is unseen, and serves to evaluate whether the model can solve real-world problems beyond its pre-defined data set.
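A minimal sketch of the train/evaluate split with scikit-learn; the held-out portion plays the role of the 'exam questions' described above, and the synthetic dataset is invented for illustration.

    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    from sklearn.linear_model import LogisticRegression

    # Synthetic stand-in for collected data.
    X, y = make_classification(n_samples=300, n_features=4, random_state=1)

    # Keep 20% of the data unseen, for evaluation only.
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, random_state=1)

    model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

    # Accuracy on unseen data tells us whether the model learned, or memorised.
    print("held-out accuracy:", round(model.score(X_test, y_test), 3))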
1.4.6     Hyper-Parameter Tuning
        Once the evaluation process is complete, we can revisit the values our model assumed implicitly and test them. Hyper-parameters are constant values that define how our model is structured; they cannot be learned from the data, so they are set by trial and error. The learning rate, i.e. how far the line shifts during each training step, is also considered a hyper-parameter. These values play a vital role in determining how accurate our model is.
      Figure 1.4
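A small sketch of systematic hyper-parameter tuning with scikit-learn's GridSearchCV; the candidate values and the toy dataset are assumptions made for illustration.

    from sklearn.datasets import make_classification
    from sklearn.model_selection import GridSearchCV
    from sklearn.linear_model import LogisticRegression

    # Synthetic labelled data standing in for a real training set.
    X, y = make_classification(n_samples=200, n_features=5, random_state=0)

    # Candidate hyper-parameter values to try; chosen arbitrarily here.
    param_grid = {"C": [0.01, 0.1, 1.0, 10.0]}

    # Try each value with cross-validation and keep the best one.
    search = GridSearchCV(LogisticRegression(max_iter=1000), param_grid, cv=5)
    search.fit(X, y)

    print("best hyper-parameter:", search.best_params_)
    print("cross-validated score:", round(search.best_score_, 3))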
1.4.7     Prediction
        At this stage our model is ready to use without any issues; it can now solve real-life problems. You can deploy it and use it to predict whether a drink is wine or beer.
Applications
              Web Search Engine: when you search on Google for a specific topic, it starts giving suggestions based upon your past searches; moreover, if you keep searching on a specific topic it will start giving suggestions related to that topic.
              Photo Tagging Applications: be it Facebook or Instagram, photo tagging makes the app even more engaging. It is all possible because of the face-recognition applications of machine learning.
              Spam Detection: deciding whether a mail is spam, promotion or normal mail is all the work of a spam classifier running at the back of the mail application.
              Online Customer Service: nowadays many websites offer an option to chat with a customer-care representative. In fact, most of the time this representative is a trained bot, which gives all the information about the website.
              Product Recommendations: when you shop for something, you get a mail about it and also shopping recommendations. This refines your shopping experience, and all this magic is done by machine learning.
                                       Deep Learning
1.1 Introduction
    Machine learning is a vast topic and a part of artificial intelligence; deep learning is a subset of machine learning inspired by the structure and function of the brain, called an ANN (Artificial Neural Network). For beginners, this means making the computer act and work like a brain: just as our brain works with the help of neurons or nerve cells, deep learning algorithms work on artificial neural networks, which carry out the algorithms and processing. The relationship can be explained using a Venn diagram.
    Learning is an ongoing process: humans adapt to changes, face situations, solve problems and carry out different tasks based on their existing knowledge and experience. Machines are now no different: they too can take decisions based on past knowledge, i.e. the past data they have worked on. A basic and very suitable example is the Google search engine: when you search on Google for a specific topic it starts giving suggestions based on past searches, and if you keep searching on specific topics it will start giving suggestions related to those topics. All of that comes under machine learning, but then what is deep learning?
    Deep learning, as mentioned, works on the principle of artificial neural networks; it helps in the recognition of patterns in data, inspired by the human brain.
    It has many types, namely Convolutional Neural Networks, Recurrent Neural Networks, Feedforward Neural Networks, etc., which we will discuss.
    Deep learning also has many advantages and applications, as it automatically classifies and develops the data and modules needed to get the output.
                                           Figure 1.1
                                         Figure 2.1
 2.1.1 Long Short-Term Memory
   Long Short-Term Memory (LSTM) is an artificial recurrent neural network architecture. An LSTM unit is composed of a cell, an input gate, an output gate and a forget gate. The architecture of an LSTM is described in the figure below:
Figure 2.2
    Applications of LSTM:
       Robot Control
       Speech recognition
       Rhythm Learning
       Handwriting recognition
       Sign Language Translation
       Music Composition
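As a hedged sketch of how an LSTM is assembled in practice, here is a minimal Keras model for classifying short sequences; the sequence length, feature size and layer widths are arbitrary example choices, not values from this text.

    import numpy as np
    from tensorflow.keras.models import Sequential
    from tensorflow.keras.layers import LSTM, Dense

    # Invented data: 100 sequences, each 20 time steps of 8 features,
    # with a binary label per sequence.
    X = np.random.rand(100, 20, 8)
    y = np.random.randint(0, 2, size=100)

    model = Sequential([
        LSTM(32, input_shape=(20, 8)),   # recurrent layer with gated memory cells
        Dense(1, activation="sigmoid"),  # binary output
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    model.fit(X, y, epochs=3, batch_size=16, verbose=0)

    print(model.predict(X[:2]))  # probabilities for two sample sequences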
To prevent this, measures such as reverse image search can be used, in which the suspect image is sent to sites such as TinEye, which finds more similar examples of it; this helps detect the vulnerabilities and prevent cyber-attacks.
   22% of the world population is on Facebook. 62% of people in the US are there. 76% of
   Facebook users and 51% of Instagram users are on it every day. One of the top 10 reasons
   people say they’re on social media is to buy products advertised to them. They spend around
   37% of their social media time interacting with branded content. 57% of Millennials say that
   social media has made the ads they see more relevant to them. 48% of people say they made
   their last online purchase as the direct result of a Facebook ad.
   Earlier, it was seen that the arrival of big names like Walmart, Amazon, etc. affected the sales of local shopkeepers, who all saw a decline in their sales. But after the advent of digital marketing, everybody now has a fair chance to advertise their products, reach out to a bigger audience and thus grow in a way that was never possible before.
                Digital marketing allows small businesses to compete with a much smaller
   advertising budget. When managed effectively, it gives them laser-focused control over
   where and how they spend their money. When you have this kind of control and the data to
   support decisions, you make smarter ones.
3. It helps in targeting the right customers
   Digital marketing helps in dividing the customers into subsets so as to better focus on individual customer needs. When targeting is done at this level, the advertisements become much more useful for the customers: because an ad is so relevant, it connects on a level that more general advertising can't, and this connection gives it the ability to influence decisions. With search advertising, you can target people with a very specific:
    Challenge
    Goal
    Profession
    Education level
    Buying behavior
      And more
   For example, you can tell Facebook to show your ad only to people with a very specific recent behavior, interest, location or other identifier. You don't spend thousands on one ad, and you can run ads indefinitely, so you can easily modify an ad to connect with different groups of people. You don't have this level of control over who sees your ad with any other form of marketing.
   With digital marketing, various things can be learned about the advertisements and the customers: whether customers saw the ad, whether they interacted with it, whether they shared it with a friend, whether they moved on to find out more about it, and so on.
   Apart from this, one can also learn what kinds of customers choose what kinds of products, how much they can spend, how often they like to shop, and through which website they reached your website.
   All this can be done with the help of simple tools like Google Analytics, or one can opt for paid tools.
   With the help of digital marketing, budgeting can be done on a daily basis according to how much one wants to spend. For example, with social media campaigns, display ads and search ads you can choose a daily budget, so you know exactly how much a campaign will cost.
   The importance of digital marketing also lies in the fact that you can get instant results: analyze the data and make changes fast to reduce wasted ad spend and lost revenue. There is no need to negotiate with anybody; if something is not working for the business, the ad can be changed instantly and a new one floated in no time.
Digital marketing gives the best ROI; email marketing, for example, is reported to return as much as 3800%.
   Email marketing is a conversion machine, but you do need a way to build your email list with quality subscribers and then deliver highly relevant content to each subscriber's inbox.
   A Forbes study found that companies using social media outsell 78% of businesses that don't use social media.
   Whether it's something a friend shared on social media, the result of a search query, or an email newsletter received in their inbox, it all starts online. The more integrated your business is with the customer's online experience, the more easily you grow your business. You need an online presence to be relevant to the vast majority of customers, and the importance of digital marketing is that it gives you that presence.
8. It is cost effective
   Digital marketing provides an easy way to advertise and reach out to customers, and it helps with regular scheduling and budgeting, depending on the requirement.
   It has made it easier for even smaller businesses to grow at a rapid pace without much increase in investment.
   These days, contacting a brand's social media page for issue resolution and other matters is a common thing to do. This, in turn, builds a strong image of the brand in the minds of new consumers, leading to more conversions.
   Digital advertisements can reach people who otherwise cannot be reached through traditional methods of advertisement, and this benefits the business a lot. The credit for this goes to the internet, which is present in most areas worldwide.
   Customers are given the utmost importance in digital marketing. There is a lot of competition in the market: if one company is not able to satisfy a customer's needs, the customer has many other options. For a business, a customer moving to a competitor is very bad. Therefore, with the help of digital marketing, businesses try to analyze customers' needs, strategize their marketing accordingly and modify their products so that higher profits can be earned.
12. Helps in business survival
     Even if you have tons of website visitors, if none of them ever convert your business will cease to exist. Digital marketing helps you make use of proven strategies and techniques that attract not necessarily more traffic, but highly targeted traffic that delivers results. Targeting the right kind of people, to deliver the right kind of results, is what digital marketing is all about: ensuring survival for your business.
1. Search engine optimization (SEO)
     SEO is also referred to as content optimization. If one uses the right content, suitable for both existing and prospective customers, the business is most likely to grow.
     Usage of the right keywords is very necessary. As a marketing strategy this matters because search engines rank highly optimized content higher on the results page than non-optimized content. Good content can improve SEO ranking, and a good rank makes it more likely that customers will reach your website.
2. Social media
     Most people who access the internet today have accounts on more than one social media platform. As a result, social media has become a very good platform for advertising products.
     A strong digital marketing strategy incorporates all social media forums appropriate to your
     organization, including Facebook, Twitter, LinkedIn, Google+, Pinterest, and Instagram.
     These tools have different purposes: Twitter has become a virtual telephone, a way for
     customers to lodge complaints or ask questions, whereas image-driven social media – such as
     Instagram and Pinterest – are a great way to get viral with visual storytelling. It’s also vital to
     stay connected to new trends, such as LinkedIn’s recent Influencers program, which
     promotes industry insiders as thought leaders and offers them a forum to share wisdom.
3. Website optimization
     Website optimization means that your website must be user-friendly enough for the customers and must provide all the important content. It is similar to SEO, but whereas SEO is more focused on optimized content, website optimization is focused on proper design, because the website is the first thing that reflects the image of your company. Website optimization can involve designing a website from scratch, adding keywords or phrases and image tags, and editing metadata to ensure that your site is accessible to a search engine.
4. Television advertisements
     Television is another important advertising tool, with a large audience. Advertisements are usually made with big names, famous actors, actresses and sports persons, as brand ambassadors, which influences people: they trust these celebrities and choose a particular product. Moreover, advertisements telecast in the prime hours attract a larger audience.
     You can rarely go wrong with a television ad, however annoying it might be, simply because it will still serve its objective of entering the minds of potential customers. This gives your business, and what you are offering, instant credibility. It also allows you to become a creative marketer and attach a personality to the business, which works effectively especially for small businesses.
     This form of marketing involves actual interaction with prospective customers. The benefit of this form of advertisement is that one gets a chance to interact with, and hence influence, the customers directly.
     It involves showing customers various presentations and giving them demos so as to increase their interest. A face-to-face interaction builds trust, which is important for the buyer-seller rapport.
6. Direct mail
     Despite being old, this is still one of the most influential tools of digital marketing. Here the customer has full control over whether to read or ignore an email, and so it faces less resistance; a customer who is completely disinterested can unsubscribe from the service.
     To get the maximum benefit from it, the list to which an email is sent should be well chosen. Also, the content of the email should be good enough that, even if a customer is not interested at first, the content creates the interest.
7. Cloud technology
Companies are using various cloud-technology-based platforms to advertise their products. One thing that is very important while using cloud-based technology is cyber security. Nowadays cyber crimes are increasing at a fast pace, and therefore it is very important to be aware of all the risks involved. Marketers can lean on cybersecurity because it provides a great deal of technological protection, thereby safeguarding their online marketing interests.
8. Pay-per-click advertising (PPC)
This technique of digital marketing involves marketers paying each time their advertisement is clicked; each click is counted as a prospective customer visiting the marketer's website. It allows advertisers to bid for ad placement in a search engine's sponsored links when someone searches on a keyword related to their business offering. For example, if we bid on the keyword “PPC software,” our ad might show up in the very top spot on the Google results page.