UNIT II
Generic UI Development
• Data components provide a unified interface for binding visual components to entities and for working with entities in screen controllers.
• Infrastructure includes the main application window and other common client mechanisms.

A screen is the main unit of the generic UI. It contains visual components, data containers and non-visual components. A screen can be displayed inside the main application window either in a tab or as a modal dialog.

The main part of the screen is a Java or Groovy class called the controller. The layout of the screen is usually defined in an XML file called the descriptor.

In order to show a screen, the framework creates a new instance of the Window visual component, connects the window with the screen controller and loads the screen layout components as child components of the window. After that, the screen's window is added to the main application window.

All screen controllers implement the FrameOwner marker interface. The name of this interface means that it has a reference to a frame, which is a visual component representing the screen when it is shown in the main application window. There are two types of frames:

• Window – a standalone window that can be displayed inside the main application window in a tab or as a modal dialog.
• Fragment – a lightweight component that can be added to windows or other fragments.

Controllers are also divided into two distinct categories according to the frames they use:

• Screen – the base class of window controllers.
• ScreenFragment – the base class of fragment controllers.
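As a rough illustration of how a window controller and its XML descriptor fit together, here is a minimal sketch in the style of generic-UI frameworks such as CUBA/Jmix that use this screen model. The controller id "demo_CustomerBrowse" and the descriptor file name "customer-browse.xml" are made-up example names, not part of any real project.

import io.jmix.ui.screen.Screen;
import io.jmix.ui.screen.Subscribe;
import io.jmix.ui.screen.UiController;
import io.jmix.ui.screen.UiDescriptor;

// Hypothetical window controller: Screen is the base class of window controllers.
@UiController("demo_CustomerBrowse")          // registers the screen under an id
@UiDescriptor("customer-browse.xml")          // XML descriptor with the layout to load
public class CustomerBrowse extends Screen {

    @Subscribe
    protected void onInit(InitEvent event) {
        // Called after the framework has created the Window component,
        // connected it to this controller and loaded the descriptor's
        // components as children of the window.
    }
}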
Most of these user interactions are touch-based and happen on colorful touch screen displays that are bursting with high-level interactions. Naturally, basic mobile UI design principles differ from those of a traditional desktop UI.

After all, users are, by definition, on the move; control is limited, giving new meaning to the phrase "all thumbs". Actions and information need to be big, bold, clear, and simple.

As mobile adoption continues to rise year by year, it's time to develop a mobile-first strategy, embraced by the likes of Facebook and other social networks, who make sure their iOS and Android apps offer a polished user experience on hand-held devices.

After all, when users have more choice and freedom to find mobile applications that work for them, a poor user experience can easily devalue your brand, hurt your revenue, and disengage your users.

Aside from investing in mobile applications, many ecommerce stores see an increase of purchases coming from mobile. If an online store doesn't optimize its checkout experience, usability or its mobile app design, it may lose market share or even render itself obsolete.

Multichannel and Multimodal UI

Speech-based interfaces have to extract the intended semantics from the utterances. Some applications use speech input for form filling. However, filling each single slot by speech is often not more efficient than typing.

The question arises: what are the important challenges in using speech as a "mainstream" modality? While ASR has made significant progress in recent years, partly driven by the successful application of deep neural networks, identifying the intended semantics for further processing by the dialog manager is still a rather difficult process. ASR capabilities are easy to integrate into new user interfaces by making use of available programming APIs.

On the technical side, one of the next challenges is therefore to realize conversational speech interaction in many applications. This requires simplifying the usage of NLP methods for information extraction, dialog processing and presentation, so that developers can easily deploy speech interfaces.
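As a concrete illustration of integrating ASR through a readily available programming API, the following sketch uses Android's built-in RecognizerIntent. It is only a minimal example of capturing one utterance, not a dialog system; the REQUEST_SPEECH constant is an arbitrary value chosen for the example.

import android.app.Activity;
import android.content.Intent;
import android.speech.RecognizerIntent;

import java.util.ArrayList;

// Launches the platform speech recognizer and reads back the best transcription.
// Mapping the recognized text to an intended semantic (slot filling, dialog
// management) is left to the application.
public class SpeechInputActivity extends Activity {

    private static final int REQUEST_SPEECH = 100;  // arbitrary request code

    private void startSpeechInput() {
        Intent intent = new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH);
        intent.putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL,
                RecognizerIntent.LANGUAGE_MODEL_FREE_FORM);
        intent.putExtra(RecognizerIntent.EXTRA_PROMPT, "Speak now");
        startActivityForResult(intent, REQUEST_SPEECH);
    }

    @Override
    protected void onActivityResult(int requestCode, int resultCode, Intent data) {
        super.onActivityResult(requestCode, resultCode, data);
        if (requestCode == REQUEST_SPEECH && resultCode == RESULT_OK && data != null) {
            ArrayList<String> results =
                    data.getStringArrayListExtra(RecognizerIntent.EXTRA_RESULTS);
            if (results != null && !results.isEmpty()) {
                String spokenText = results.get(0);
                // hand spokenText over to form filling / dialog processing here
            }
        }
    }
}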
Multimodal interfaces put the choice of modality in the hands of the users. In that matter it is important to better understand the specific benefits that emerge for individual users. Information about these benefits can be revealed by observing the users' modality choice behavior.

Understanding the factors influencing users' modality choice will enable interface designers to adapt applications to the advantage of the user, and to inform the user about extra possibilities of interaction.

Gesture Based UI

Natural User Interfaces (NUIs) are so natural to users that the interface feels, and sometimes is, invisible, like a touch screen interface. Some NUIs even use gesture control, for example features that give users touchless control over car volume, calls and more.

Gestures are growing more common in user interface design and play increasingly complex roles in our everyday lives.

Tapping, swiping, dragging, long-pressing – these are but a few of the gestures that have come to dominate our digital experiences. Touch screen iPhones mainstreamed mobile gestures years ago, and we haven't looked back since.

Gestures affect how we interact with interfaces, including phones, laptops and iPads. But we don't have to look far to find a gestural interface beyond our work and entertainment devices. It's no longer uncommon to use gestures when interacting with car screens or bathroom sinks.
As technology advances, UX and UI designers and businesses will need to adapt. You don't have to know all the technological intricacies or have an in-depth knowledge of computer intelligence. Still, you should have a basic understanding of the capabilities, functions and best design practices for gesture technology. Understanding your users is essential, even in gesture design.

Gestures cross the barrier between the physical and digital realms, allowing us to interact with digital media with our bodies. In some ways, this makes using digital applications more fun, but that isn't enough to make a gesture a good one.

A good motion gesture improves usability by making applications easier to use in all contexts. Well-designed gestures have a shorter learning curve because they feel natural and are easy to pick up on. Three of the most significant benefits of gestures are cleaner interfaces, ease of use and improved task completion.
1. Cleaner Interfaces

When every action needs its own button, screens quickly become cluttered. Designers can use gestures to reduce the number of visual elements, like buttons, that take up space.

2. Ease of Use

As discussed above, interactions become more natural with a gesture-based interface. The ease of simple hand gestures allows us to use technology with minimal effort at maximum speed.

3. Improved Task Completion

Task completion rates and conversion rates increase when there is less friction in the interface.

Common Mobile Gestures

Below are some of the most common gestures that all (or almost all) users are familiar with – even if not consciously. We mention screens, but you can substitute the screen for a touchpad or any other gesture interface.

Tap

A tap gesture is when you tap on the screen with one finger to open or select something, like an app or page. Here's a tip: design clickable interface elements so that the entire box or row is clickable – not just the text. Giving users more space increases usability.
Swipe

Swiping involves moving your finger across the screen in one direction, touching down on one side and lifting your finger on the other. Swipe gestures are often used for scrolling or switching between pages. Tinder uses swiping right to match with a profile and swiping left to pass over a profile.

You can also conduct a swipe gesture with two or three fingers. This is a common feature on laptop touchpads that use two- and three-finger swipes for different actions.

Drag

Dragging uses the same general motion as a swipe, only you move your finger slower and don't lift it until you've pulled the object to where you want it to be. You use dragging to move an item to a new location, like when re-organizing your phone apps.

Fling

Like swiping, a fling gesture is when you move your finger across the screen in a quicker motion, without keeping contact with an element. Flings are often used to remove something from view.

Long Press

A long press is when you tap the screen but hold your finger down for longer than usual. Long presses often open up menu options.

Pinch

One of many two-finger gestures, a pinch is when you hold two fingers apart on the screen and then drag them towards each other in a pinching motion. Pinch gestures are often used to zoom back out after zooming in. Sometimes they present a view of all your open screens for navigation purposes.

Pinch-Open or Spread

A pinch-open or spread gesture is the opposite of a pinch. You hold your two fingers down close together and then spread them apart. Spreading, like double-tapping, is generally used to zoom in.
Rotation

To do a rotation, press on the screen with two fingers and rotate them in a circular motion. The best example of rotation is when you turn a map or an image with two fingers.
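On Android, most of the gestures above can be recognized with the platform's GestureDetector and ScaleGestureDetector classes rather than by processing raw touch events by hand. The sketch below is only meant to show the shape of the API; the class name, helper method and log tag are arbitrary examples.

import android.content.Context;
import android.util.Log;
import android.view.GestureDetector;
import android.view.MotionEvent;
import android.view.ScaleGestureDetector;
import android.view.View;

// Attach gesture detection to any View; taps, long presses, flings and
// pinches/spreads are delivered through listener callbacks.
public class GestureDemo {
    private static final String TAG = "GestureDemo"; // example log tag

    public static void attach(Context context, View view) {
        GestureDetector tapDetector = new GestureDetector(context,
                new GestureDetector.SimpleOnGestureListener() {
                    @Override
                    public boolean onSingleTapUp(MotionEvent e) {
                        Log.d(TAG, "tap");          // open / select something
                        return true;
                    }

                    @Override
                    public void onLongPress(MotionEvent e) {
                        Log.d(TAG, "long press");   // e.g. show a menu
                    }

                    @Override
                    public boolean onFling(MotionEvent e1, MotionEvent e2,
                                           float velocityX, float velocityY) {
                        Log.d(TAG, "fling");        // e.g. remove something from view
                        return true;
                    }
                });

        ScaleGestureDetector scaleDetector = new ScaleGestureDetector(context,
                new ScaleGestureDetector.SimpleOnScaleGestureListener() {
                    @Override
                    public boolean onScale(ScaleGestureDetector detector) {
                        // factor < 1 means pinch (zoom out), > 1 means spread (zoom in)
                        Log.d(TAG, "scale factor " + detector.getScaleFactor());
                        return true;
                    }
                });

        view.setOnTouchListener((v, event) -> {
            scaleDetector.onTouchEvent(event);
            return tapDetector.onTouchEvent(event);
        });
    }
}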
The basic building block for a user interface is a View object, which is created from the View class, occupies a rectangular area on the screen and is responsible for drawing and event handling. View is the base class for widgets, which are used to create interactive UI components like buttons, text fields, etc.

The ViewGroup is a subclass of View and provides an invisible container that holds other Views or other ViewGroups and defines their layout properties.

At the third level we have different layouts, which are subclasses of the ViewGroup class. A typical layout defines the visual structure for an Android user interface and can be created either at run time using View/ViewGroup objects or by declaring the layout in a simple XML file, e.g. main_layout.xml, located in the res/layout folder of your project.

This tutorial is more about creating your GUI based on layouts defined in an XML file. A layout may contain any type of widgets such as buttons, labels, textboxes, and so on. Following is a simple example of an XML file using LinearLayout:

<?xml version="1.0" encoding="utf-8"?>
<LinearLayout
   xmlns:android="http://schemas.android.com/apk/res/android"
   android:layout_width="fill_parent"
   android:layout_height="fill_parent"
   android:orientation="vertical" >

   <TextView android:id="@+id/text"
      android:layout_width="wrap_content"
      android:layout_height="wrap_content"
      android:text="This is a TextView" />

   <Button android:id="@+id/button"
      android:layout_width="wrap_content"
      android:layout_height="wrap_content"
      android:text="This is a Button" />

   <!-- More GUI components go here -->

</LinearLayout>
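For comparison, here is a rough sketch of building the same kind of layout at run time with View/ViewGroup objects instead of the XML descriptor, as mentioned above. It assumes a plain Activity; the widget texts are just example values.

import android.app.Activity;
import android.os.Bundle;
import android.widget.Button;
import android.widget.LinearLayout;
import android.widget.TextView;

public class MainActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        // Create the container programmatically instead of inflating XML
        LinearLayout layout = new LinearLayout(this);
        layout.setOrientation(LinearLayout.VERTICAL);

        TextView text = new TextView(this);
        text.setText("This is a TextView");
        layout.addView(text);

        Button button = new Button(this);
        button.setText("This is a Button");
        layout.addView(button);

        // Use the ViewGroup as the activity's content view
        setContentView(layout);
    }
}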
RelativeLayout

RelativeLayout is a view group that displays child views in relative positions.

Android RelativeLayout enables you to specify how child views are positioned relative to each other. The position of each view can be specified as relative to sibling elements or relative to the parent.
Table Layout

Android TableLayout arranges groups of views into rows and columns. You will use the <TableRow> element to build a row in the table. Each row has zero or more cells; each cell can hold one View object. TableLayout containers do not display border lines for their rows, columns, or cells.

Absolute Layout

An Absolute Layout lets you specify exact locations (x/y coordinates) of its children. Absolute layouts are less flexible and harder to maintain than other types of layouts without absolute positioning.
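Returning to TableLayout for a moment: the row/cell structure described above is usually declared in XML with <TableRow> elements, but the sketch below shows the same idea programmatically so the structure is explicit. The class name and the cell texts are made-up example values.

import android.app.Activity;
import android.os.Bundle;
import android.widget.TableLayout;
import android.widget.TableRow;
import android.widget.TextView;

public class TableDemoActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        TableLayout table = new TableLayout(this);

        // Each TableRow is one row; each child view added to it is one cell
        TableRow row = new TableRow(this);
        TextView nameCell = new TextView(this);
        nameCell.setText("Name");
        TextView valueCell = new TextView(this);
        valueCell.setText("Value");
        row.addView(nameCell);
        row.addView(valueCell);

        table.addView(row);
        setContentView(table);
    }
}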
Frame Layout

Frame Layout is designed to block out an area on the screen to display a single item. It is generally used to hold a single child view, because it can be difficult to organize child views in a way that's scalable to different screen sizes without the children overlapping each other.

ListView

ListView is a view group that displays a list of scrollable items. The list items are automatically inserted to the list using an Adapter that pulls content from a source such as an array or database.
An adapter bridges between UI components and the data source that fills data into the UI component. The adapter holds the data and sends it to the adapter view; the view takes the data from the adapter view and shows it in different views such as a spinner, list view, grid view, etc.

The ListView and GridView are subclasses of AdapterView, and they can be populated by binding them to an Adapter, which retrieves data from an external source and creates a View that represents each data entry.

Android provides several subclasses of Adapter that are useful for retrieving different kinds of data and building views for an AdapterView (i.e. ListView or GridView). The common adapters are ArrayAdapter, BaseAdapter, CursorAdapter, SimpleCursorAdapter, SpinnerAdapter and WrapperListAdapter. We will see separate examples of these adapters.
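As one small illustration of the ArrayAdapter case, the sketch below binds a plain string array to a ListView inside an Activity. The list data is made up for the example, the ListView is created programmatically to keep the sketch self-contained, and android.R.layout.simple_list_item_1 is a built-in single-line row layout.

import android.app.Activity;
import android.os.Bundle;
import android.widget.ArrayAdapter;
import android.widget.ListView;

public class ListDemoActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        // Example data source; in practice this could come from a database
        String[] fruits = {"Apple", "Banana", "Cherry", "Mango"};

        // ArrayAdapter creates a View (here a simple text row) for each entry
        ArrayAdapter<String> adapter = new ArrayAdapter<>(
                this, android.R.layout.simple_list_item_1, fruits);

        // Bind the adapter to the ListView so it can populate itself
        ListView listView = new ListView(this);
        listView.setAdapter(adapter);
        setContentView(listView);
    }
}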
GridView
Android GridView shows items in a two-dimensional scrolling grid (rows and columns). The grid items are not necessarily predetermined; they are automatically inserted into the layout using a ListAdapter. As with ListView, the adapter bridges between the UI component and the data source, and can be used to supply data to views such as a spinner, list view or grid view.
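A GridView is populated the same way, by handing it a ListAdapter. The short sketch below reuses the ArrayAdapter idea from the previous example; the item strings and the three-column setting are made-up example values.

import android.app.Activity;
import android.os.Bundle;
import android.widget.ArrayAdapter;
import android.widget.GridView;

public class GridDemoActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        String[] items = {"1", "2", "3", "4", "5", "6"}; // example grid items

        GridView grid = new GridView(this);
        grid.setNumColumns(3); // rows & columns: three items per row
        grid.setAdapter(new ArrayAdapter<>(
                this, android.R.layout.simple_list_item_1, items));
        setContentView(grid);
    }
}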
Screen Elements

Input controls are the interactive components in your app's user interface. Android provides a wide variety of controls you can use in your UI, such as buttons, text fields, seek bars, check boxes, zoom buttons, toggle buttons, and many more.
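To show how such an input control is typically wired up in code, here is a small sketch: a Button declared in a layout is looked up by its id and given a click listener. It assumes the earlier example layout, i.e. a resource named main_layout.xml containing a Button with the id @+id/button.

import android.app.Activity;
import android.os.Bundle;
import android.widget.Button;
import android.widget.Toast;

public class ControlsDemoActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.main_layout);  // layout containing the Button

        // Look up the input control declared in XML and react to clicks
        Button button = findViewById(R.id.button);
        button.setOnClickListener(v ->
                Toast.makeText(this, "Button clicked", Toast.LENGTH_SHORT).show());
    }
}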
VoiceXML

What is VoiceXML?

The Voice eXtensible Markup Language (VoiceXML) is an XML-based markup language for creating distributed voice applications that users can access from any telephone. It has been accepted for submission by the World Wide Web Consortium (W3C) as a standard for voice markup on the Web.

The VoiceXML language lets you use a familiar markup style and Web server-side logic to deliver voice content to the Internet. The VoiceXML applications you create can interact with your existing back-end business data and logic, and can feature:

• Spoken input
• Telephone keypad input
• Recording of spoken input
• Synthesized speech output ("text-to-speech")
• Recorded audio output
• Telephony features such as call transfer and disconnect
• Dialog flow control
• Scoping of input