MODULE 3
Planning the BI Project: Project planning is not a one-time activity. Since a project plan is
based on estimates, which are frequently no more than best guesses, project plans must be
adjusted constantly. The number one telltale sign that a project is not being managed is a static
project plan on which estimates and milestones have never changed from the day they were first
developed. Here is the sequence of activities for preparing a project plan.
1. Create a work breakdown structure listing activities, tasks, and subtasks.
2. Estimate the effort hours for these activities, tasks, and subtasks.
3. Assign resources to the activities, tasks, and subtasks.
4. Determine the task dependencies.
5. Determine the resource dependencies.
6. Determine the critical path based on the dependencies.
7. Create the detailed project plan.
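As a rough illustration of steps 4 and 6, the Python sketch below (task names and effort hours are hypothetical) derives the critical path from a table of effort estimates and task dependencies. Here the critical path is simply the dependency chain with the largest cumulative effort; real scheduling tools also account for resource dependencies and calendars.

# Minimal sketch (hypothetical tasks and hours): derive the critical path
# from effort estimates (step 2) and task dependencies (step 4).

effort = {"WBS": 8, "Estimate": 6, "Assign": 4, "Design": 20, "Build": 40, "Test": 16}
depends_on = {
    "Estimate": ["WBS"],
    "Assign": ["Estimate"],
    "Design": ["Assign"],
    "Build": ["Design"],
    "Test": ["Build"],
}

def critical_path(effort, depends_on):
    memo = {}

    def earliest_finish(task):
        # A task can start only after all of its predecessors finish.
        if task not in memo:
            preds = depends_on.get(task, [])
            start = max((earliest_finish(p) for p in preds), default=0)
            memo[task] = start + effort[task]
        return memo[task]

    # The critical path ends at the task with the latest finish time.
    end = max(effort, key=earliest_finish)
    path = [end]
    while depends_on.get(path[-1]):
        # Walk back through the predecessor that constrains the start.
        path.append(max(depends_on[path[-1]], key=earliest_finish))
    return list(reversed(path)), memo[end]

path, hours = critical_path(effort, depends_on)
print(" -> ".join(path), f"({hours} effort hours)")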
What Are Project Resources?
Project resources are components that are necessary for successful project implementation.
They include people, equipment, money, time, and knowledge: basically, anything you may
require from the project planning phase through project delivery. A lack of resources is
therefore a constraint on completing the project, which is why resource management is a key
project management activity and a major determinant of project success.
What Is Resource Management in Project Management?
Resource management is the process of planning, scheduling, and allocating resources
necessary for successful project delivery. Resource planning is an essential part of any project
management methodology that usually takes place at the early project stages.
For example, in Waterfall projects, resource planning is part of the initial Requirements phase,
where all resources are allocated and scheduled precisely, because this methodology does not
allow project requirements to change. In contrast, in Agile projects, resource management takes
place in every development cycle: after one cycle has finished, a new cycle starts with a new
resource planning phase.
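As a minimal sketch (resources, tasks, and hours below are hypothetical), one routine check during resource planning is flagging over-allocation, where a resource is scheduled for more hours than it has available in a period:

# Minimal sketch (hypothetical data): flag resources whose scheduled
# hours in a given week exceed their available capacity.

capacity = {"analyst": 40, "developer": 40}   # hours available per week
assignments = [
    ("analyst", "Gather requirements", 24),
    ("analyst", "Document KPIs", 24),
    ("developer", "Build dashboard", 32),
]

scheduled = {}
for resource, task, hours in assignments:
    scheduled[resource] = scheduled.get(resource, 0) + hours

for resource, hours in scheduled.items():
    if hours > capacity[resource]:
        print(f"{resource} over-allocated: {hours}h scheduled vs {capacity[resource]}h available")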
What is risk management and why is it important?
Risk management is the process of identifying, assessing and controlling threats to an
organization's capital, earnings and operations. These risks stem from a variety of sources,
including financial uncertainties, legal liabilities, technology issues, strategic management
errors, accidents and natural disasters.
A successful risk management program helps an organization consider the full range of risks it
faces. Risk management also examines the relationship between different types of business
risks and the cascading impact they could have on an organization's strategic goals.
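One common assessment technique is to score each identified risk by probability times impact and rank the results. The Python sketch below is illustrative only; the risks and numbers are hypothetical:

# Minimal sketch (hypothetical risks): a probability x impact scoring
# pass, one common way to rank risks during assessment.

risks = [
    {"risk": "Key supplier delay", "probability": 0.4, "impact": 8},
    {"risk": "Data breach", "probability": 0.1, "impact": 10},
    {"risk": "Scope creep", "probability": 0.7, "impact": 5},
]

# Rank risks from highest to lowest exposure.
for r in sorted(risks, key=lambda r: r["probability"] * r["impact"], reverse=True):
    score = r["probability"] * r["impact"]
    print(f'{r["risk"]}: exposure score {score:.1f}')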
Risk management has perhaps never been more important than it is now. The risks that modern
organizations face have grown more complex, fueled by the rapid pace of globalization. New
risks are constantly emerging, often related to and generated by the now-pervasive use of digital
technology. Climate change has been dubbed a "threat multiplier" by risk experts.
A recent external risk that initially manifested itself as a supply chain issue at many companies
-- the COVID-19 pandemic -- quickly evolved into an existential threat, affecting the health and
safety of employees, the means of doing business, the ability to interact with customers and
corporate reputations.
Businesses made rapid adjustments in response to the threats posed by the pandemic. But, going forward,
they are grappling with novel risks, including the ongoing issue of how or whether to bring
employees back to the office, what can be done to make supply chains less vulnerable, inflation
and the business and economic effects of the war in Ukraine.
In many companies, business executives and the board of directors are taking a fresh look at
their risk management programs. Organizations are reassessing their risk exposure, examining
risk processes and reconsidering who should be involved in risk management. Companies that
currently take a reactive approach to risk management -- guarding against past risks and
changing practices after a new risk causes harm -- are considering the competitive advantages
of a more proactive approach. There is heightened interest in supporting business sustainability,
resiliency and agility. Companies are also exploring how AI technologies and sophisticated
governance, risk and compliance (GRC) platforms can improve risk management.
What is a cost justification strategy and why do you need one?
A cost justification strategy is a way to show why the proposed costs for a particular project are
needed to accomplish the project tasks. Justifying the costs is beneficial because it clearly
establishes why the proposed spending is necessary. The cost justification strategy provides a
complete snapshot of the proposed work and helps justify both the need and the price.
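As a hedged illustration (all figures are hypothetical), a cost justification often boils down to a simple cost-benefit summary: total cost, return on investment over some horizon, and payback period.

# Minimal sketch (hypothetical figures): the kind of cost-benefit
# summary a cost justification strategy might include.

costs = {"licenses": 50_000, "development": 120_000, "training": 15_000}
annual_benefit = 110_000   # e.g., estimated savings from faster reporting

total_cost = sum(costs.values())
roi = (annual_benefit * 3 - total_cost) / total_cost   # over a 3-year horizon
payback_years = total_cost / annual_benefit

print(f"Total cost: ${total_cost:,}")
print(f"3-year ROI: {roi:.0%}, payback in {payback_years:.1f} years")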
Requirements Gathering: Requirements gathering is the process of understanding what you
are trying to build and why you are building it. Requirements gathering is often regarded as a
part of developing software applications or of cyber-physical systems like aircraft, spacecraft,
and automobiles (where specifications cover both software and hardware). It can, however, be
applied to any product or project, from designing a new sailboat to building a patio deck to
remodeling a bathroom.
Three Main Subprocesses of Requirements Gathering
The key ingredients of the requirements gathering process are three overlapping subprocesses:
requirements elicitation, requirements documentation, and requirements confirmation.
Requirements elicitation is the process of asking for and collecting top-level requirements
from all relevant stakeholders. Effort should be made to account for the needs of
customers, their users, the internal stakeholders within your own company, and your
key suppliers.
Requirements documentation organizes the input from the requirements elicitation process
into whatever format is appropriate for your organization. This formatting may include:
User stories
Functional decompositions (especially for complex cyber-physical systems)
Feature descriptions
These will be collected in a top-level requirements specification like a product requirements
document (PRD) or a system specification. The purpose of this top-level specification is
to make those stories and descriptions available to all members of the project team.
Requirements confirmation is the process of making sure all stakeholders and team members
have a common understanding of what you’re trying to build. This involves reviewing and
refining the requirements. It will very likely require additional elicitation and revision of
the documentation as well.
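As one possible way to tie these subprocesses together (the fields and IDs below are hypothetical), a requirement can be recorded once at elicitation and then tracked as it is documented and confirmed:

# Minimal sketch (hypothetical fields): recording a requirement as it
# moves through elicitation, documentation, and confirmation.

from dataclasses import dataclass

@dataclass
class Requirement:
    req_id: str
    description: str          # e.g., a user story captured during elicitation
    source: str               # the stakeholder who raised it
    status: str = "elicited"  # elicited -> documented -> confirmed

reqs = [
    Requirement("REQ-001", "As an analyst, I can filter sales by region.", "Sales lead"),
    Requirement("REQ-002", "Reports export to PDF.", "Finance team"),
]
reqs[0].status = "documented"   # e.g., added to the PRD
print([f"{r.req_id}: {r.status}" for r in reqs])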
What are the Benefits of Requirements Gathering?
Beyond the obvious advantage of having requirements with which to work, a good
requirements gathering process offers the following benefits:
Greatly improves the chances that customers and users will get what they
want. Stakeholders often have difficulty putting into words exactly what it is that
they need. You’re going to have to help them, and it’s going to take some digging.
Decreases the chances of a failed project. A frequently heard lament following
unsuccessful projects is, “The requirements weren’t clear.”
Reduces the overall cost of the project by catching requirements problems before
development begins. Requirements that are ambiguous or not fully understood often result
in costly scrap and rework. Numerous studies have shown that the cost of fixing
requirements errors rises exponentially over subsequent phases of development.
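To make that last point concrete: multipliers of the general shape below are commonly cited, though the exact figures vary widely by study, so treat this as illustrative only.

# Illustrative only -- relative fix-cost multipliers of this general
# shape are commonly cited, but exact figures vary widely by study.

fix_cost_multiplier = {"requirements": 1, "design": 5, "coding": 10,
                       "testing": 20, "post-release": 100}

base_cost = 200   # hypothetical cost ($) to fix an error during requirements
for phase, mult in fix_cost_multiplier.items():
    print(f"Error found in {phase}: ~${base_cost * mult:,}")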
How do you prioritize and validate user requirements with limited time and resources?
Use a structured framework: One of the first steps to prioritize and validate user requirements
is to use a structured framework to guide your process. A framework can help you define the
scope, goals, and objectives of your project, identify the key stakeholders and users, and
establish the criteria and methods for prioritization and validation. A common framework for
requirements gathering is the MoSCoW method, which stands for Must have, Should have,
Could have, and Won’t have. This method helps you categorize the user requirements based on
their importance, urgency, and value. Another framework is the Kano model, which classifies
the user requirements based on their impact on user satisfaction and dissatisfaction.
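As a small illustration of the MoSCoW method (the requirements and priorities below are hypothetical), categorization can be as simple as bucketing each requirement under one of the four labels:

# Minimal sketch (hypothetical requirements): bucketing items with the
# MoSCoW method described above.

requirements = [
    ("Login with single sign-on", "must"),
    ("Export dashboard to PDF", "should"),
    ("Dark mode theme", "could"),
    ("Offline mobile mode", "wont"),
]

labels = {"must": "Must have", "should": "Should have",
          "could": "Could have", "wont": "Won't have"}

buckets = {key: [] for key in labels}
for name, priority in requirements:
    buckets[priority].append(name)

for key, label in labels.items():
    print(f"{label}: {buckets[key]}")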
Conduct user research
Another step to prioritize and validate user requirements is to conduct user research to
understand the needs, preferences, and pain points of your target users. User research can help
you discover the problems and opportunities that your system, product, or service can address,
as well as the features and functions that your users expect and desire. User research can be
done using various methods, such as interviews, surveys, observations, focus groups, usability
tests, and user personas. The key is to choose the methods that are suitable for your project
scope, budget, and timeline, and to involve the users as early and as often as possible in the
process.
Use visual tools and techniques
Prioritizing and validating user requirements can be done effectively with visual tools and
techniques. User stories, user journey maps, wireframes, and prototypes are all useful for
capturing user needs and expectations in a concise and consistent way, as well as identifying
pain points, opportunities, and emotions that the user experiences. Additionally, these visual
tools and techniques can help to visualize and test the user requirements, as well as solicit
feedback and input from stakeholders and users.
Review and refine
The final step to prioritize and validate user requirements is to review and refine them based on
the feedback and validation that you receive from your stakeholders and users. Doing so can
help ensure that the requirements are clear, complete, accurate, and aligned with the project
goals and objectives. To review and refine user requirements, you can use techniques such as a
requirements traceability matrix or a requirements review. The former allows you to track and
verify that the user requirements are met and fulfilled throughout the project lifecycle, while the
latter brings together different stakeholders and users to identify any issues, gaps, or conflicts
in the user requirements.
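As a minimal sketch of a requirements traceability matrix (the IDs are hypothetical), each requirement is mapped to the test cases that verify it, which makes coverage gaps easy to spot during a review:

# Minimal sketch (hypothetical IDs): a requirements traceability matrix
# mapping each requirement to the test cases that verify it.

rtm = {
    "REQ-001": ["TC-101", "TC-102"],
    "REQ-002": ["TC-201"],
    "REQ-003": [],   # no test coverage yet -- a gap the review should flag
}

for req, tests in rtm.items():
    status = "covered" if tests else "NOT COVERED"
    print(f"{req}: {status} ({', '.join(tests) or 'no tests'})")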
What is a requirements change?
Changes to a requirement can begin as early as the point at which the requirement is elicited
and can continue beyond the product’s release (in the form of maintenance). Requirements
change management is therefore defined as the process of managing changing requirements
throughout the requirements engineering process and system development.
BI Application Design
KPIs are identified during Business Requirements Definition.
Each KPI should be mapped onto one or more appropriate BI components, such as:
Reports
Dashboards and Scorecards
Pivot Tables
Arrange KPI Visualizations into logical groups
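As a rough sketch (the KPIs and component choices below are hypothetical), this mapping and grouping step might be captured as simply as:

# Minimal sketch (hypothetical KPIs): map each KPI to one or more BI
# components, then group the visualizations logically by component.

kpi_components = {
    "Monthly revenue": ["dashboard", "report"],
    "Customer churn rate": ["scorecard"],
    "Sales by region": ["dashboard", "pivot table"],
}

groups = {}
for kpi, components in kpi_components.items():
    for component in components:
        groups.setdefault(component, []).append(kpi)

for component, kpis in sorted(groups.items()):
    print(f"{component}: {kpis}")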
BI Design Definitions
In order to ensure all BI applications have a similar look and feel, the following standards
should be identified, documented, and used during the development stage:
Common layout standards – Navigation, filtering, visualization locations, etc.
Delivery Platform standards – including desktop, web, mobile app, etc.
Visualization Type standards – including the types of charts and their overall look and
feel.
User Interface standards – including colors, fonts, logos and other boilerplate materials.
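As a closing illustration (all values below are hypothetical), such standards are often captured in a shared configuration that every BI application reads:

# Minimal sketch (hypothetical values): design standards captured as a
# shared configuration so every BI application has the same look and feel.

import json

design_standards = {
    "layout": {"navigation": "left sidebar", "filters": "top bar"},
    "platforms": ["desktop", "web", "mobile app"],
    "charts": {"trend": "line", "comparison": "bar", "share": "donut"},
    "ui": {"font": "Open Sans", "primary_color": "#1F4E79"},
}

print(json.dumps(design_standards, indent=2))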