
INDUSTRIAL TRAINING REPORT

on
VOLTUS-xFi: RESEARCH AND DEVELOPMENT

Submitted in partial fulfillment of the requirement for the degree of

Bachelor of Technology (B.Tech)


in
Electronics Engineering (Internet of Things)
(2021-2025)

Submitted By:
Vansh Sardana (21001017067)

Under the mentorship of
Mr. Ritesh Agrawal
Software Engineering Director
Cadence Design Systems
Noida Special Economic Zone, Noida, 201305 India

Department of Electronics Engineering


J.C. BOSE UNIVERSITY OF SCIENCE & TECHNOLOGY
YMCA, FARIDABAD

CANDIDATE’S DECLARATION

I hereby certify that the work which is being presented in this project report titled
“Voltus-xFi: Research and Development”, submitted to J. C. Bose University of
Science & Technology, YMCA, Faridabad, is an authentic record of my own work
carried out in the company Cadence Design Systems, Noida. The work contained in
this report has not been submitted to any other university or institute.

Student Signature :

Student Name : Vansh Sardana

Student Roll No. : 21001017067

EXAMINER’S EVALUATION

The project report has been evaluated by us.

Internal Examiner

Signature: ___________________

Name: ___________________

Designation: ___________________

Date: ___________________

External Examiner

Signature: ___________________

Name: ___________________

Designation: ___________________

Date: ___________________

ACKNOWLEDGEMENT

It gives me immense pleasure to take this opportunity to express my heartfelt gratitude
to Cadence Design Systems, Noida for giving me the opportunity to complete my training
with their esteemed organization. This training experience has been invaluable in shaping
my knowledge, skills, and professional growth. I am grateful to have had the chance to
work alongside a team of talented and dedicated professionals who have been
instrumental in my learning journey.

I would like to extend my sincere gratitude to Mr. Ritesh Agarwal, Software
Engineering Director, Cadence Design Systems, for his exceptional guidance and
mentorship throughout my training at the company. His expertise and dedication to his
work have been truly inspiring, and I am grateful for the opportunities he provided me to
learn and grow.

I would also like to express my deepest appreciation to Dr. Munish Vashishth, Head of
the Department of Electronics Engineering, and Prof. Rajesh Kr. Ahuja, Training and
Placement Officer, for their invaluable contributions in providing me with an excellent
opportunity to pursue this internship. Their support and encouragement have been
instrumental in helping me secure this valuable experience.

I would also like to extend my thanks to Ms. Archana Jain (Faculty Mentor), Ms.
Archana Agarwal (Faculty Mentor) and all the faculty members of the Department of
Electronics Engineering at J.C. Bose University of Science and Technology, YMCA,
Faridabad for their constant encouragement and assistance in shaping my academic and
professional journey.

I express my heartfelt thanks to my parents, friends and colleagues for their constant
support, love and motivation.

Vansh Sardana

21001017067

TABLE OF CONTENTS

S.No	Contents

I.	Training Completion Certificate
II.	Candidate’s Declaration
III.	Examiner’s Evaluation
IV.	Acknowledgement
V.	Table of Contents
1.	Chapter 1: Company Profile
2.	Chapter 2: Introduction to Project
3.	Chapter 3: Requirement Analysis
4.	Chapter 4: Design and Methodology
5.	Chapter 5: Tech Stack Used
6.	Chapter 6: Information About Testing Strategy
7.	Chapter 7: Screenshots of GUI
8.	Chapter 8: Conclusion and Future Scope
9.	References
10.	Brief Profile of Student

Chapter 1

COMPANY PROFILE

1.1 HISTORY OF THE COMPANY

Cadence Design Systems was established in 1988 as a public American multinational
company specializing in electronic design automation (EDA) software and
engineering services. With the rapid advancement of semiconductor and electronics
technologies, Cadence’s tools and solutions enable companies to design and verify
integrated circuits (ICs), systems on chips (SoCs), and printed circuit boards (PCBs)
with efficiency and accuracy. Cadence plays a pivotal role in the development of
cutting-edge electronics, empowering innovation in consumer devices, automotive,
aerospace, and more. Rooted in the idea of harmony between creativity and precision,
Cadence helps engineers and designers bring their ideas to life from concept to
silicon.

Headquarters: San Jose, California, United States

Website: https://www.cadence.com/

Industry: Electronic Design Automation (EDA), Semiconductors, Software


Development

1.2 DIFFERENT PRODUCTS OF CADENCE DESIGN SYSTEMS

1. Virtuoso Platform

Custom IC Design for Analog, RF, and Mixed-Signal Circuits

● Schematic Capture and Simulation: Provides an intuitive environment for
creating analog and mixed-signal designs and running detailed circuit
simulations.
● Layout and Physical Design: Offers advanced layout editing, automation,
and analog placement and routing tools to accelerate design closure.
● Mixed-Signal Verification: Integrates with simulation tools like Spectre
and supports behavioral modeling for mixed-signal SoCs.
● Design Rule Checking and Verification: Ensures manufacturability and
performance with integrated physical and electrical rule checking.
● Advanced Node Support: Optimized for cutting-edge process nodes like
5nm and below, enabling design scalability and performance.

2. Spectre Simulation Platform

Accurate and Scalable SPICE Simulation


● Analog and RF Simulation: High-performance simulation engine for
precise analog, RF, and mixed-signal circuits.
● Fast Convergence and Scalability: Supports large-scale designs with
reduced runtime and memory consumption.
● Noise and Reliability Analysis: Simulates circuit noise and aging effects to
ensure long-term reliability.

3. Innovus Implementation System

Full-Flow Digital Physical Design


● RTL-to-GDSII Flow: Seamless integration from synthesis to physical
implementation and signoff.
● Concurrent Clock and Data Optimization: Enhances timing closure and
power efficiency.
● AI-Driven Placement and Routing: Machine learning-guided placement
for faster design convergence.

4. Genus Synthesis Solution

High-Performance RTL Synthesis

● Logic Optimization and Area Reduction: Helps reduce silicon area and
improve performance.
● Multi-Scenario Synthesis: Optimizes designs for various process-voltage-
temperature (PVT) conditions.
● Incremental Synthesis: Shortens turnaround time for large SoC designs.

5. Palladium Enterprise Emulation Platform

Hardware Emulation for Pre-Silicon Verification


● Fast SoC Validation: Enables software bring-up and hardware/software
co-verification before tape-out.
● Concurrent Multi-User Support: Allows parallel verification by multiple
teams.
● Emulation of Entire Systems: Supports large and complex designs with
high fidelity.

6. Protium Prototyping Platform

FPGA-Based Prototyping for Early Software Development


● Early Software Testing: Supports OS and application-level testing before
silicon availability.
● Fast Prototyping Turnaround: Accelerates development cycle by enabling
rapid hardware debugging.
● Seamless Transition from Palladium: Shared front-end toolchain enables
quick migration between platforms.

7. Allegro PCB Design Suite

Comprehensive Board-Level Design and Analysis


● Constraint-Driven Design: Enforces electrical and design rules throughout
the workflow.
● High-Speed Design Support: Enables signal integrity and timing-driven
routing for modern PCB needs.
● ECAD-MCAD Co-Design: Facilitates cross-domain collaboration for
mechanical and electrical design.

8. Sigrity SI/PI Analysis

PCB and IC Package Analysis


● SI/PI Analysis: Ensures robust signal integrity and stable power delivery
in high-speed designs.
● Power-Aware Simulation: Validates PDN performance for high-reliability
systems.
● Thermal and EMI Modeling: Assists in evaluating thermal and
electromagnetic effects.

9. Clarity 3D Solver

Next-Generation Electromagnetic Simulation


● Massively Parallel EM Analysis: Enables fast, accurate simulation of
high-frequency components.
● AI-Assisted Modeling: Reduces setup time and improves simulation
efficiency.
● System-Level Integration: Ideal for high-speed interconnects in 3D ICs,
PCBs, and packages.

Chapter 2

INTRODUCTION TO PROJECT

Securing an internship at Cadence was a challenging and rewarding experience that
required dedication, perseverance, and hard work. As part of the backend sub-team of the
Voltus team, I had the opportunity to work on real-world projects that had a significant
impact on the company's products and services. This project was undertaken as part of
my 8th semester, allowing me to apply theoretical knowledge to practical problems and
gain valuable industry experience. Throughout the internship, I was able to develop my
skills and expertise in software development, problem-solving, and teamwork.

Voltus-xFi is a product of Cadence, a leading provider of electronic design automation
(EDA) and semiconductor intellectual property (IP). The Cadence Voltus-xFi Custom
Power Integrity Solution is a transistor-level electromigration and IR drop (EM-IR) tool
that delivers foundry-supported SPICE-level accuracy for power integrity signoff.
Voltus-xFi is designed to provide advanced functionality in its domain, and as a backend
team member, I contributed to enhancing its performance, efficiency, and reliability. The
product is used by various organizations and industries, and my work on it had the
potential to make a significant impact on the company's success.

Figure 1: Different Teams working for the development of Voltus-xFi

TEAM STRUCTURE:

The Research and Development team is divided into the following sub-teams:

● Backend: EM (Electromigration)
● Frontend: Tungsten
o Main UI
o Result Browser

Figure 2: Sub-teams in Research and Development Team of Voltus Fi

2.1 PROJECT MOTIVATION

The motivation behind this project was multifaceted and driven by several key factors.

● Improving Efficiency: By reducing dependencies and optimizing memory usage,
we aimed to make Voltus-xFi more efficient in terms of resource utilization. This
would enable the product to perform better, respond faster, and handle larger
workloads.
● Enhancing Maintainability: The project sought to simplify the codebase by
removing unnecessary kits and legacy code. This would make it easier for
developers to understand, modify, and maintain the code, reducing the time and
effort required for future updates and fixes.
● Increasing Competitiveness: With competitors providing similar functionality
with lower memory usage (between 60 GB and 100 GB), Cadence's tool was at
a disadvantage. By reducing memory usage from around 180 GB to a more
competitive level, we aimed to make Voltus-xFi more attractive to customers and
increase its market share.

● Reducing Costs: By streamlining the codebase and reducing dependencies, we
expected to lower development and maintenance costs. This would enable the
company to allocate resources more effectively and invest in other areas of the
product.
● Improving Customer Satisfaction: Ultimately, the project's goal was to deliver a
better product to customers. By improving performance, reducing errors, and
enhancing overall quality, we aimed to increase customer satisfaction and build
loyalty.

2.2 PROBLEM IDENTIFICATION

Figure 3: Problem Identification in Voltus-xFi

The project aimed to address the following key issues in Voltus-xFi:

1. Compile-Time Warnings

● The codebase was generating a significant number of compile-time
warnings, which can indicate potential issues and make maintenance more
difficult.
● These warnings can also lead to errors, crashes, or unexpected behavior,
impacting the overall quality and reliability of the product.

2. Dependency Management Issues

● The product had a large number of dependencies, many of which were
unnecessary or redundant.
● The team lacked visibility into which kits were being used where and why,
and whether these kits were actually needed.
● The kits were a part of the legacy codebase, and over time, many of them
had become obsolete or redundant, leading to increased compile times,
maintenance challenges, and potential errors.

3. High Memory Utilization

● The product's memory utilization was high, with competitors providing
similar functionality with memory usage between 60 GB and 100 GB.
● Cadence's tool was using approximately 180 GB, making it less
competitive and potentially impacting performance.

4. Lack of Daily Build Notifications

● There was no automated mechanism to track and report daily build status
and warning trends, making it challenging for the team to identify issues
early on and track progress.

Each of these issues presented a unique challenge, but addressing them would have a
significant impact on the product's performance, maintainability, and overall quality.

2.3 PROJECT OBJECTIVE

The objective of this project was to improve the overall efficiency, maintainability, and
reliability of Voltus-xFi by achieving the following specific goals:

1. Compile-Time Warning Reduction

● Identify and fix code issues that cause compile-time warnings


● Reduce the number of warnings to improve code quality and
maintainability
● Gain familiarity with the codebase
● Utilize debugging tools such as GDB and Valgrind to identify and fix
complex issues

2. Dependency Reduction

● Identify and remove unnecessary dependencies in the codebase


● Determine where the required kits are actually being utilized by the product

3. Memory Optimization

● Optimize memory usage to make the product more competitive


● Reduce memory utilization to a level comparable with industry
competitors (between 60 GB and 100 GB)

4. Automated Daily Build Tracking

● Develop and implement an automated daily build tracking system


● Provide visibility into daily build status and warning trends
● Enable the team to track progress and identify issues early on

2.4 PROJECT SCOPE

The project scope includes reducing unnecessary dependencies, optimizing memory


usage, minimizing compile-time warnings, and implementing an automated daily build
tracking system for Voltus-xFi. This involves analyzing the codebase, identifying areas
for improvement, and implementing changes to reduce dependencies, memory utilization,
and warnings. Additionally, the project includes developing and configuring an
automated daily build tracking system to provide visibility into build status and warning
trends, ultimately delivering a more efficient, maintainable, and reliable product.

Chapter 3

REQUIREMENT ANALYSIS

The requirement analysis for this project involved a thorough examination of the
Voltus-xFi codebase and the identification of key objectives and requirements for the tasks
performed. The primary goals of the project were to reduce unnecessary dependencies,
optimize memory usage, minimize compile-time warnings, and implement an automated
daily build tracking process.

3.1 REQUIREMENTS

To achieve the project objectives, the following requirements were identified:

● Analyze the codebase to identify areas for improvement, including unnecessary
dependencies, memory-intensive code, and compile-time warnings.
● Implement changes to reduce dependencies, memory utilization, and warnings,
while ensuring that the functionality and behavior of the product are not impacted.
● Use debugging tools such as GDB and Valgrind to identify and fix complex issues.
● Develop a script to automate daily build tracking and reporting, providing
visibility into build status and warning trends.

3.2 FUNCTIONAL REQUIREMENTS

The functional requirements for the project included:

● Identifying and removing unnecessary dependencies in the codebase.


● Optimizing memory usage to reduce memory utilization.
● Fixing code issues that cause compile-time warnings.
● Automating daily build tracking and reporting.

3.3 PERFORMANCE REQUIREMENTS

The performance requirements for the project included:

● Ensuring that the changes made to the codebase do not impact the functionality
and behavior of the product.
● Optimizing memory usage to make the product more efficient and competitive
with minimal loss of precision.

3.4 METHODOLOGY

To achieve the project objectives and meet the requirements, the following methodology
was employed:

● Code analysis: The codebase was thoroughly analyzed to identify areas for
improvement.
● Code refactoring: Changes were made to the codebase to reduce dependencies,
memory utilization, and warnings.
● Debugging: Debugging tools such as GDB and Valgrind were used to identify and
fix complex issues.
● Automation: A script was developed to automate daily build tracking and
reporting.

By understanding the requirements and objectives of the project, the team was able to
deliver the expected outcomes and achieve the project goals. The requirement analysis
played a critical role in ensuring that the project was well-planned and executed, and that
the deliverables met the needs of the stakeholders.

Chapter 4

DESIGN AND METHODOLOGY

4.1 BRIEF

Across all projects, a structured approach was employed to ensure successful outcomes.
Key design considerations included:

1. Thorough analysis and identification of areas for improvement


2. Tailored methodologies to meet specific project needs
3. Automation and scripting to streamline processes and reduce manual effort
4. Rigorous testing and verification to ensure functionality and accuracy, and
5. Effective communication and reporting to stakeholders.

By considering these factors, the projects were able to achieve significant benefits,
including improved efficiency, reduced memory utilization, minimized warnings, and
enhanced productivity.

4.2 DEPENDENCY OPTIMIZATION


The dependency reduction project was a crucial step in improving the maintainability
and efficiency of Voltus-xFi. The project involved a thorough analysis of the
codebase to identify unnecessary dependencies, which were then removed to reduce
the complexity of the code. The methodology employed in this project included a
comprehensive review of the codebase, identification of unused kits and libraries, and
removal of unnecessary dependencies.

4.2.1 Legacy Kit Dependency Analysis

1. Kit Removal Analysis: Two methods were employed to analyze kit dependencies:
● Manual Removal: Each kit was manually removed, and
dependencies were checked. This method was time-consuming,
requiring 3 hours per kit.
● Automated Removal: A script was developed to automate kit
removal, make clean builds, and save log files. The logs were
analyzed the next day to determine kit usage (a simplified sketch
of this automation follows Figure 4).

2. Kit Usage Identification: A reference guide with error messages was
created to identify required kits.
3. Dependency Reduction: 143 unneeded kits were identified and removed,
reducing dependencies by 45%, shrinking the product size, and improving
maintainability.

Figure 4: Kit Dependency Optimization Process
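
To illustrate the automated approach, a minimal C++ sketch of the removal loop is shown below. This is not the internal tooling (which was script-based); the helper script name (disable_kit.sh), the kit names, and the log file layout are hypothetical stand-ins.

// Minimal sketch of the automated kit-removal experiment (illustrative only;
// disable_kit.sh and the kit names are hypothetical).
#include <cstdlib>
#include <fstream>
#include <string>
#include <vector>

int main() {
    // Candidate kits to test for removal (hypothetical names).
    std::vector<std::string> kits = {"kitA", "kitB", "kitC"};

    for (const std::string& kit : kits) {
        // Disable the kit, run a clean build, and capture the log
        // for next-day analysis.
        std::string cmd = "./disable_kit.sh " + kit +
                          " && make clean && make > build_" + kit + ".log 2>&1";
        int rc = std::system(cmd.c_str());

        // Record a one-line verdict per kit.
        std::ofstream summary("removal_summary.txt", std::ios::app);
        summary << kit << (rc == 0 ? " : build OK, kit likely unused\n"
                                   : " : build failed, kit required\n");
    }
    return 0;
}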

4.2.2 Kit Dependency Removal through Code Refactoring

1. API Behavior Analysis: The API behavior was confirmed, and multiple
edge cases were verified.

2. Code Refactoring: Two methods were employed to replace kit dependencies:
● Manual Replacement: Used for 2500+ occurrences of complex
APIs.

● Script-Based Replacement: Used for 4000+ occurrences of simple
APIs, such as data types or simple functions (see the sketch after
Figure 5).

Figure 5: Required Kit Dependency Removal Process
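
The sketch below illustrates the idea behind the script-based replacement: a whole-identifier substitution over a source file using regular expressions. The legacy type name (LegacyKitInt) and its replacement are hypothetical examples, not actual kit APIs.

// Illustrative sketch of a script-based API replacement (hypothetical names).
#include <fstream>
#include <iostream>
#include <regex>
#include <sstream>
#include <string>

int main(int argc, char* argv[]) {
    if (argc < 2) { std::cerr << "usage: replace <file>\n"; return 1; }

    // Read the whole source file into memory.
    std::ifstream in(argv[1]);
    std::stringstream buf;
    buf << in.rdbuf();
    std::string text = buf.str();

    // Replace a legacy kit data type with its standard equivalent,
    // matching whole identifiers only (\b word boundaries).
    text = std::regex_replace(text, std::regex("\\bLegacyKitInt\\b"),
                              "std::int32_t");

    // Write the modified text back in place.
    std::ofstream out(argv[1]);
    out << text;
    return 0;
}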

4.2.3 Kit Bundling

1. Kit Analysis: The kits were analyzed to identify which libraries were coming
from which kit.

2. Library Usage Identification: Only 19 out of 88 libraries from the kit were
found to be actually used by Voltus-xFi.

3. Kit Consolidation: The information was forwarded to the required team, and a
single kit was created in lieu of 13 kits.

Figure 6: Kit Bundling Workflow

4.3 MEMORY OPTIMIZATION OF VOLTUS-XFI

1. Memory Profiling: The first step was to identify areas of high memory utilization
in the code. This was done through memory profiling, which helped to pinpoint
specific classes and functions that were consuming excessive memory.

2. Code Analysis: Once areas of high memory utilization were identified, a thorough
code analysis was conducted to understand the root cause of the issue.

3. Optimization Techniques: Based on the analysis, various optimization techniques
were employed to reduce memory utilization. These included:

Figure 7: Memory Optimization Process

● Data Type Optimization: Changing data types to more efficient ones, such
as from double to float (illustrated in the sketch at the end of this section).
● Code Refactoring: Refactoring code to reduce unnecessary memory
duplication and improve efficiency.
● Report Generation Optimization: Modifying code to generate reports only
for specified currents mentioned in the config file.

4. Testing and Verification: After implementing optimizations, thorough testing and
verification were conducted to ensure that the changes did not impact the
functionality and precision of the tool.
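
To illustrate the data type optimization above, the following minimal sketch compares the storage cost of a hypothetical per-node record in double versus float precision; the struct and node count are illustrative, not the tool's actual data structures.

// Estimating the memory impact of switching double to float (illustrative).
#include <cstddef>
#include <cstdio>

struct NodeCurrentF64 { double avg, peak, rms; };  // 24 bytes per node
struct NodeCurrentF32 { float  avg, peak, rms; };  // 12 bytes per node

int main() {
    const std::size_t nodes = 100000000;  // e.g., 100M nodes in a large design
    std::printf("double-based storage: %zu MB\n",
                nodes * sizeof(NodeCurrentF64) / (1024 * 1024));
    std::printf("float-based storage : %zu MB\n",
                nodes * sizeof(NodeCurrentF32) / (1024 * 1024));
    return 0;
}

At this hypothetical scale, the change alone roughly halves storage (about 2.3 GB down to 1.1 GB), which is why the precision impact of such changes had to be verified separately.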

4.4 COMPILER WARNING REMOVAL

Figure 8: Compiler Warning Removal Process

The compiler warning removal project aimed to minimize warnings in the
codebase, improving code quality, maintainability, and reducing potential issues.
1. Warning Analysis: A thorough analysis of compiler warnings was conducted
to identify the root causes of the warnings.

2. Code Review: A code review was performed to understand the context and
impact of each warning.
3. Warning Fixing: Warnings were fixed by addressing the underlying issues,
such as unused variables, type mismatches, and syntax errors (representative
fixes are sketched at the end of this section).
4. Code Refactoring: Code was refactored to improve readability,
maintainability, and reduce potential issues.
After the fixes, the warning count dropped from a total of 1583 warnings to just 28;
the remaining warnings originated in other kits or involved more complex issues
that required further investigation and resolution.
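
The following generic example (not code from the Voltus-xFi codebase) shows two representative fixes of the kind made in this task: removing an unused variable and eliminating a signed/unsigned comparison.

// Representative compile-time warning fixes (generic illustration).
#include <cstddef>
#include <vector>

// Before (warned): an unused local variable, and a signed loop index
// compared against an unsigned size():
//   int unused = 0;
//   for (int i = 0; i < values.size(); ++i) ...
//
// After: the unused variable is removed and the index uses std::size_t,
// so both warnings disappear without changing behavior.
double sum(const std::vector<double>& values) {
    double total = 0.0;
    for (std::size_t i = 0; i < values.size(); ++i) {
        total += values[i];
    }
    return total;
}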

4.5 AUTOMATED BUILD AND WARNING COMPARE SCRIPT

Figure 9: Automated build and warning compare script workflow

1. Script Development: The script was developed to perform the following tasks:
● Identify Code Base: Determine the code base, version, and port being
used.

● Run Make Commands: Execute the required make commands based on
the identified parameters.
● Save Logs: Save the logs generated during the build process.
2. Warning Comparison: A Perl script was developed (its core logic is sketched after this list) to:
● Count Warnings: Count the number of warnings generated during the
build process.
● Compare Warnings: Compare the current warnings with the previous
warnings to identify any changes.
3. Notification: An email was sent to the team group on Outlook with the results
of the build and warning comparison.
4. Automation: The script was automated using a cron job, ensuring that it ran at
regular intervals without manual intervention.
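
The comparison itself was implemented in Perl; for consistency with the rest of this report, the sketch below expresses the same core counting-and-comparison logic in C++. The log file names are hypothetical.

// Core logic of the warning-compare step (C++ re-expression of a Perl script;
// log file names are hypothetical).
#include <cstddef>
#include <fstream>
#include <iostream>
#include <string>

// Count lines containing "warning:" in a build log.
std::size_t countWarnings(const std::string& path) {
    std::ifstream log(path);
    std::string line;
    std::size_t count = 0;
    while (std::getline(log, line)) {
        if (line.find("warning:") != std::string::npos) ++count;
    }
    return count;
}

int main() {
    const std::size_t today = countWarnings("build_today.log");
    const std::size_t yesterday = countWarnings("build_yesterday.log");
    const long long delta = static_cast<long long>(today) -
                            static_cast<long long>(yesterday);
    std::cout << "warnings today: " << today << " (change: " << delta << ")\n";
    return 0;
}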

Chapter 5

TECH STACK USED

5.1 C / C++

C and C++ are powerful general-purpose programming languages that form the
backbone of many modern software systems, especially those requiring performance,
control over hardware, and resource efficiency. In the context of this internship
project, C and C++ were chosen for their close-to-metal capabilities, deterministic
behavior, and efficient execution, particularly in scenarios involving memory-
sensitive operations and low-level system interactions.

5.1.1 Why C/C++ Was Used

The choice of C and C++ was driven by the nature of the project, which
involved developing components that required high performance, memory
management control, and direct hardware interfacing. C enabled low-level
programming such as pointer arithmetic, manual memory allocation, and
bitwise operations, which were essential for optimizing resource usage. C++,
being largely a superset of C, allowed object-oriented programming (OOP),
enabling better code modularity, maintainability, and reusability through the
use of classes, inheritance, polymorphism, and encapsulation.

5.1.2 Key Features Utilized

● Pointers and Memory Management

Manual memory handling using malloc, calloc, free (in C) and new,
delete (in C++) allowed for optimized resource allocation.

● Object-Oriented Programming

Classes, inheritance, and abstraction were used to implement modular


components. This structure enhanced readability and reduced code
duplication.

● Standard Template Library (STL)

Containers like vector, map, set, and algorithms from STL helped manage
dynamic data efficiently and provided tested, reliable implementations (see
the combined example after Figure 10).

● File Handling

The project included reading and writing to configuration files and logs
using fstream in C++, ensuring persistence and traceability.

Figure 10: Key features of C++
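
The following minimal sketch (with hypothetical data and file names) ties together the STL and file-handling features listed above.

// STL containers plus fstream-based persistence (illustrative only).
#include <fstream>
#include <map>
#include <string>
#include <vector>

int main() {
    // STL containers manage their own memory, avoiding manual new/delete.
    std::vector<double> currents = {0.12, 0.34, 0.56};
    std::map<std::string, double> peakByNet;
    peakByNet["VDD"] = 0.56;

    // File handling with fstream: persist results to a log file.
    std::ofstream log("report.log");
    for (double c : currents) log << "current=" << c << "\n";
    for (const auto& [net, peak] : peakByNet)
        log << net << " peak=" << peak << "\n";
    return 0;
}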

5.1.3 Tools and Utilities

● GDB (GNU Debugger)


Used for debugging runtime issues, breakpoints, watch expressions, and
stack traces. It allowed line-by-line inspection of variables and control flow.
● Valgrind
Crucial for detecting memory leaks, uninitialized memory usage, and invalid
memory access. This tool helped ensure memory safety throughout the
project.
● AddressSanitizer (ASan)
Integrated with the build system to detect out-of-bound memory access and
use-after-free errors at runtime.
● Makefiles
Used to automate the build process and manage dependencies across
modules. This helped maintain a consistent and efficient development
workflow.

5.1.4 Applications in the Project

C/C++ played a central role in the development of:

● Core Backend Logic: High-performance modules responsible for
computation-heavy tasks.
● Data Parsing and Processing: Efficient manipulation of binary and text
data streams using C-style I/O operations.
● Performance Optimization: Critical parts were profiled and optimized
using C constructs and inline assembly (if applicable).
● Interfacing with System Resources: Direct interaction with memory and
system calls was handled using low-level C constructs.

5.1.5 Challenges and Learning Outcomes

One of the key challenges was managing memory efficiently while avoiding
segmentation faults and memory leaks. This required an in-depth
understanding of pointer operations, dynamic memory allocation, and
debugging with tools like Valgrind and ASan. The experience also reinforced
best practices such as:

● Initializing pointers and variables


● Avoiding memory leaks and dangling pointers
● Understanding stack vs. heap allocation
● Writing reusable and modular C++ classes

Furthermore, using the Standard Template Library (STL) encouraged
thinking in terms of abstractions and algorithms, enhancing the quality and
efficiency of the codebase.
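
As a generic illustration of these practices, the sketch below contrasts a dangling-pointer mistake with two safer alternatives:

// Dangling pointer pitfall and safer alternatives (generic illustration).
#include <memory>

// Dangling pointer: returns the address of a stack variable that no
// longer exists once the function returns.
//   int* bad() { int x = 42; return &x; }

// Safer: return by value, or make heap ownership explicit with a
// smart pointer.
int good_value() { int x = 42; return x; }

std::unique_ptr<int> good_heap() {
    return std::make_unique<int>(42);  // freed automatically by the owner
}

int main() {
    int v = good_value();
    auto p = good_heap();
    return v == *p ? 0 : 1;
}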

5.2 LINUX

Linux is a widely-used open-source operating system known for its stability, security,
and developer-friendly environment. During the internship, Linux served as the
primary development platform due to its compatibility with low-level programming
tools, efficient memory management, and powerful command-line utilities.

5.2.1 Why Linux Was Used

The project involved system-level programming, debugging, and performance
analysis—areas where Linux excels. It provided direct access to system
resources, support for manual memory handling in C/C++, and integration
with essential development tools. Linux also offered better compatibility with
utilities like GCC, GDB, Valgrind, and Make, all of which played a critical
role in the development workflow.

5.2.2 Key Tools and Features

● Terminal (Bash Shell): Used for compiling code, running programs,
managing files, and automation via shell scripting.
● GCC (GNU Compiler Collection): Primary compiler for C/C++ code,
offering fine-grained control over compilation and debugging options.
● GDB & Valgrind: Native support for these tools enabled effective debugging
and memory analysis.
● Makefiles: Helped automate builds and manage module dependencies.
● File Permissions & Process Monitoring: Used commands like chmod, ps,
and top to manage execution rights and monitor resource usage.
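
Beyond command-line tools like ps and top, resource usage can also be checked programmatically on Linux. The minimal sketch below reads the process's peak resident set size via getrusage:

// Programmatic resource monitoring on Linux (minimal sketch).
#include <sys/resource.h>
#include <cstdio>

int main() {
    struct rusage usage;
    if (getrusage(RUSAGE_SELF, &usage) == 0) {
        // ru_maxrss is reported in kilobytes on Linux.
        std::printf("peak resident set size: %ld KB\n", usage.ru_maxrss);
    }
    return 0;
}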

5.2.3 Benefits in the Project

Linux’s command-line efficiency, open-source toolchain, and low-level
control enabled a streamlined and productive development process. Its
lightweight environment also allowed better performance benchmarking and
facilitated clean integration of C/C++ modules with debugging workflows.

5.3 VISUAL STUDIO CODE (VS CODE)

Visual Studio Code (VS Code) is a lightweight yet powerful source-code editor
developed by Microsoft. It offers built-in support for numerous programming
languages and tools, making it an ideal Integrated Development Environment (IDE)
for both system-level and high-level development. During the internship, VS Code
was used as the primary code editor due to its speed, extensibility, and developer-
friendly features.

5.3.1 Why VS Code Was Used

VS Code provided a seamless and efficient environment for editing,
debugging, and managing project files. Its support for C/C++, Python, Shell
scripting, and Git made it versatile for multi-language development.
Integration with extensions like C/C++ IntelliSense, CodeLLDB, and Python
enhanced the development experience by offering intelligent code completion,
inline error detection, and real-time debugging.

5.3.2 Key Features Utilized

● Syntax Highlighting & IntelliSense: Helped with code readability,
auto-completion, and inline documentation.
● Integrated Terminal: Allowed execution of shell commands, builds, and
scripts without leaving the editor.
● Git Integration: Streamlined version control operations such as commits,
pushes, pulls, and branching.
● Extensions: Utilized extensions like C/C++ by Microsoft, Live Share for
collaboration, and Code Runner for quick code execution.
● Debugging Tools: Enabled breakpoints, watch expressions, call stack
inspection, and variable tracking for C++ and Python.

5.3.3 Benefits in the Project

VS Code’s performance and customization made it ideal for iterative
development. It helped reduce context switching between tools by combining
code editing, version control, and debugging into a single interface. This
improved productivity, ensured cleaner code, and enhanced collaboration
during the development cycle.

5.4 PERFORCE (HELIX CORE)

Perforce, officially known as Helix Core, is a high-performance version control
system designed for large-scale enterprise development. It is widely used in industries
where teams collaborate on large codebases with frequent changes. During the
internship, Perforce was used for source control, enabling efficient collaboration,
version tracking, and change management throughout the development lifecycle.

5.4.1 Why Perforce Was Used

The project involved working with multiple developers, large code files, and
frequent updates. Perforce was chosen for its ability to handle large binary
assets and codebases with speed and reliability. Its centralized model allowed
tight control over code changes while offering features like changelists,
shelving, and file locking—crucial for preventing merge conflicts and
ensuring consistency.

5.4.2 Key Features Utilized

● Changelists: Grouped related file changes for better organization and review.
● Shelving: Temporarily stored changes without committing, useful for code
reviews and task switching.
● File Locking: Prevented simultaneous edits on critical files.
● Workspace Management: Mapped local file structures to depot locations for
smooth development and deployment.
● Integration with VS Code: Used extensions to view diffs, submit changelists,
and resolve conflicts directly from the IDE.

5.4.3 Benefits in the Project

Perforce enhanced collaboration by ensuring all contributors had access to the
latest, most stable code. It minimized conflicts through centralized control,
improved traceability through changelists, and maintained code quality with
structured check-ins. Its enterprise-grade performance ensured reliability even
during large-scale builds and frequent updates.

5.5 REVIEW BOARD

Review Board is a web-based code review tool that helps teams improve code
quality through collaborative peer reviews. It streamlines the review process by
allowing developers to submit code changes, comment on specific lines, and track
revisions over time. During the internship, Review Board played a key role in
maintaining code standards, catching bugs early, and encouraging team-based
feedback on submitted changes.

5.5.1 Why Review Board Was Used

As part of a collaborative development workflow, it was essential to have a
platform that facilitated systematic code reviews. Review Board integrated
well with Perforce, allowing changelists to be directly uploaded for review
before being submitted to the repository. This ensured that every code change
was examined for correctness, style, and impact, reducing errors and
improving maintainability.

Figure 11: Review Board GUI

5.5.2 Key Features Utilized

● Inline Code Comments: Allowed reviewers to comment on specific lines,
improving the precision of feedback.
● Change History & Diff Viewer: Displayed differences between revisions to
help understand the context of changes.
● Status Tracking: Provided a clear workflow with “Ship It”, “Needs Fixing”,
or “Open Issues” statuses.
● Perforce Integration: Enabled developers to upload pending changelists
directly from the terminal or GUI using rbt post.
● Email Notifications: Kept team members informed of review updates and
approvals.

5.5.3 Benefits in the Project

Review Board improved code quality by enforcing thorough review practices.
It helped identify logic errors, code smells, and non-adherence to coding
standards early in the development cycle. The asynchronous review model
also ensured that feedback was timely and documented, making collaboration
easier and more transparent across the team.

5.6 TIGERVNC VIEWER

TigerVNC (Virtual Network Computing) Viewer is a high-performance, open-source
remote desktop application that allows users to connect to and control remote
machines via a graphical interface. During the internship, TigerVNC Viewer was
used to remotely access development environments and Linux servers that hosted
tools and codebases, especially when working from different network zones or
systems without direct desktop access.

5.6.1 Why TigerVNC Viewer Was Used

The internship setup often required interacting with remote Linux machines
that had specific toolchains, licensed software, or test environments not
available locally. TigerVNC Viewer provided a reliable way to visually access
and operate these remote desktops, enabling development, testing, and
debugging without being physically present or requiring local installation of
all tools.

Figure 12: TigerVNC Viewer Connection Window

5.6.2 Key Features Utilized

● Remote GUI Access: Accessed full Linux desktop environments remotely,
with graphical interfaces for tools like VS Code, terminals, and file
browsers.

● Session Persistence: Allowed ongoing sessions to be resumed even after
network interruptions or system restarts.
● Lightweight & Fast: Delivered a responsive remote desktop experience even
over limited bandwidth connections.
● Cross-Platform Support: Used on Windows systems to access Linux
development environments seamlessly.
● Secure Connections: Integrated with SSH tunneling for secure remote
access.

5.6.3 Benefits in the Project

TigerVNC enabled seamless collaboration and remote development across
geographically distributed systems. It reduced dependency on local setups,
simplified testing on target environments, and allowed quick troubleshooting
and deployment tasks. Its visual interface helped speed up tasks that were
cumbersome via CLI alone, making remote work significantly more efficient.

5.7 VALGRIND

Valgrind is an instrumentation framework for building dynamic analysis tools. Its
most widely used tool, Memcheck, detects memory management issues such as
leaks, invalid accesses, and use of uninitialized memory. During the internship,
Valgrind was essential for ensuring memory safety and performance integrity in
C/C++ code.

5.7.1 Why Valgrind Was Used

Since the project involved system-level C/C++ programming with manual
memory allocation, detecting memory leaks and pointer-related bugs was
critical. Valgrind enabled in-depth analysis of runtime behavior, helping
identify subtle issues that could lead to crashes, undefined behavior, or
memory inefficiencies.

5.7.2 Key Features Utilized

● Memcheck: Detected memory leaks, out-of-bounds reads/writes, and
use-after-free errors.
● Leak Summary: Provided categorized reports (definite, possible, reachable
leaks) for targeted fixes.
● Stack Traces: Helped locate the exact lines and call stacks responsible for
memory errors.
● Suppression Files: Used to ignore known harmless warnings to keep
reports clean.
● Integration with GDB: Combined with GDB for in-depth debugging of
faulty memory usage.
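
A minimal example of the kind of defect Memcheck flags is shown below (a deliberately leaky toy program, not project code). Built with -g and run under valgrind --leak-check=full, the allocation is reported as definitely lost.

// A deliberately leaky program of the kind Memcheck flags.
// Build with -g and run:  valgrind --leak-check=full ./leak_demo
#include <cstdlib>

int main() {
    int* data = static_cast<int*>(std::malloc(100 * sizeof(int)));
    data[0] = 1;      // block is allocated and used, but never freed
    data = nullptr;   // last pointer to the block is overwritten
    return 0;         // Memcheck reports 400 bytes "definitely lost"
}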

5.7.3 Benefits in the Project

Valgrind significantly improved code reliability by exposing hidden memory
issues early in development. It helped enforce best practices in memory
handling, reduced runtime crashes, and ensured efficient resource usage.
Regular usage of Valgrind became part of the debugging workflow, leading to
cleaner, more stable, and production-ready code.

5.8 CSH (C Shell)

CSH, or C Shell, is a Unix shell developed as an alternative to the Bourne shell (sh).
It provides a C-like syntax for scripting and interactive use, along with features like
command history, job control, and scripting capabilities. During the internship, CSH
was used primarily for automating build steps, running environment setup scripts, and
executing repeatable development tasks on Linux systems.

5.8.1 Why CSH Was Used

In the project environment, several existing scripts and system tools were
implemented using CSH. Leveraging CSH enabled compatibility with legacy
scripts while offering a structured and familiar syntax for users with C
programming backgrounds. It allowed for quick prototyping and modification
of automated workflows in a Unix terminal setting.

5.8.2 Key Features Utilized

● Alias and History Support: Simplified frequently used commands and
enhanced productivity in the terminal.
● Script Automation: Created and modified .csh scripts to automate
environment setup, file backups, and test runs.
● Control Flow Statements: Used if, switch, while, and foreach for
decision-making and looping in scripts.
● Job Control: Managed background processes and interactive job
suspension/resumption.
● Environment Configuration: Modified environment variables like PATH,
LD_LIBRARY_PATH, and EDITOR.

5.8.3 Benefits in the Project

CSH streamlined repetitive command-line tasks and improved efficiency in
managing development environments. It provided readable and maintainable
scripting support, helping bridge automation with manual terminal usage.
Despite newer alternatives, its compatibility with existing tooling made it a
valuable component of the development stack.

5.9 PERL

Perl is a high-level, interpreted programming language known for its powerful text-
processing capabilities and support for scripting, system administration, and rapid
prototyping. During the internship, Perl was primarily used for automating test
workflows, parsing logs, and manipulating configuration files in large-scale build
environments.

5.9.1 Why Perl Was Used

Perl was integrated into the existing infrastructure for scripting tasks involving
file parsing, data extraction, and dynamic file generation. Its ability to handle
regular expressions and manipulate large text files with minimal code made it
ideal for post-processing logs, generating reports, and customizing
configuration setups in the project.

5.9.2 Key Features Utilized

● Regular Expressions: Used extensively for parsing structured logs and
extracting relevant data.
● Text & File Processing: Efficiently read, wrote, and edited files using
built-in functions like open, print, and chomp.
● CPAN Modules: Leveraged modules from the Comprehensive Perl Archive
Network to extend functionality.
● One-Liners: Used quick command-line Perl scripts to perform ad-hoc file
manipulations.
● Cross-Platform Execution: Perl scripts ran consistently across Linux-based
environments used in the project.

5.9.3 Benefits in the Project

Perl simplified many scripting tasks that would have been complex in other
languages. Its expressive syntax and mature ecosystem enabled fast
development and integration with system tools. It played a vital role in
automating routine operations and improving the efficiency of build and
validation workflows.

Chapter 6

TESTING STRATEGY

The testing strategy employed a comprehensive approach to ensure the quality and
reliability of the code changes. The strategy included a multi-faceted approach to verify
the functionality, performance, and stability of the code.

6.1 ACCURACY TESTING

1. Functional Verification: Manually verifying the functionality of the code changes
after implementation was a crucial step in the testing strategy. This involved
testing the code changes in various scenarios to ensure they worked as expected.
The goal was to identify any functional issues or bugs that could impact the
overall performance of the system.

2. Regression Testing: Running local regressions for all ports, which took 2 hours
each and included over 400 regressions per port. These regressions tested the code
changes in various scenarios, ensuring that they did not introduce any functional
or performance issues. The goal was to identify any regressions that could impact
the overall quality of the code.

6.2 PERFORMANCE TESTING

1. Local Optimized Builds: Performing local optimized builds on various ports,
including LNX86, LNPPC, and LNA86, to ensure that the code changes did not
introduce any performance issues. Build times ranged from about 18 minutes to
2 hours depending on the port.

2. Debug Builds: Creating debug builds for all ports to facilitate further debugging
and testing. These builds included debugging symbols and allowed developers to
step through the code, examine variables, and identify issues. Build times again
ranged from about 18 minutes to 2 hours depending on the port.

6.3 EDGE CASE TESTING

1. Error Handling: Testing the code's error handling mechanisms to ensure that they
handled unexpected inputs and edge cases correctly. This involved testing the
code's ability to handle errors and exceptions, and ensuring that it provided useful
error messages and logging information.

2. Boundary Value Testing: Testing the code's behavior at boundary values to ensure
that it handled these cases correctly. This involved testing the code's behavior at
the limits of its input range, and ensuring that it behaved correctly in these
scenarios.
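
As a generic illustration of boundary value testing, the sketch below exercises a hypothetical clamping function at and just beyond its limits; the function and range are stand-ins for the tool's real input handling.

// Boundary value checks on a hypothetical input-clamping function.
#include <cassert>
#include <limits>

// Clamp a configured value to a supported range.
double clampValue(double value, double lo, double hi) {
    return value < lo ? lo : (value > hi ? hi : value);
}

int main() {
    const double lo = 0.0, hi = 1.0;
    assert(clampValue(lo, lo, hi) == lo);           // lower boundary
    assert(clampValue(hi, lo, hi) == hi);           // upper boundary
    assert(clampValue(-1e-9, lo, hi) == lo);        // just below range
    assert(clampValue(hi + 1e-9, lo, hi) == hi);    // just above range
    assert(clampValue(std::numeric_limits<double>::infinity(),
                      lo, hi) == hi);               // extreme input
    return 0;
}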

6.4 USER EXPERIENCE TESTING

1. Virtuoso Testing: Testing Virtuoso for GUI-related changes to ensure that they
did not introduce any issues and that the EMIR report was generated successfully.
This involved testing the GUI functionality to ensure that it worked correctly and
provided the required information.

6.5 DEBUGGING AND TROUBLESHOOTING

1. Debugging: Using Valgrind and GDB for debugging issues and errors. Valgrind
is a powerful tool for debugging memory-related issues, such as memory leaks or
invalid memory accesses. GDB, on the other hand, is a general-purpose debugger
that allows developers to step through code, examine variables, and identify the
root cause of issues.

6.6 CODE QUALITY AND RELIABILITY

1. Code Review: Raising code reviews for mentors and incorporating their
suggestions to ensure code quality. This involved reviewing the code changes to
ensure they met the coding standards and best practices. The goal was to ensure
that the code changes were maintainable, efficient, and easy to understand.

2. DFT Updates: Adding or updating Design for Testability (DFT) components
wherever required. DFT is a design approach that ensures the code is testable and
maintainable. By adding or updating DFT components, developers could ensure
that the code changes were testable and met the testing requirements.

6.7 DEPLOYMENT

1. CM Build and Regressions: Launching CM builds and running regressions to
ensure the code changes did not introduce any issues. These builds and
regressions tested the code changes in various scenarios, ensuring that they were
stable and reliable.

2. PV Regressions: Running PV regressions for different ports to ensure the code
changes were stable and reliable. This involved testing the code changes on
different platforms and verifying that they behaved consistently on each.

3. ASAN Build: Launching ASAN builds to verify memory leakage and ensure the
code changes did not introduce any memory-related issues. ASAN is a memory
error detector that identifies memory-related issues, such as memory leaks or
invalid memory accesses.

In conclusion, the testing strategy employed a comprehensive approach to ensure the
quality and reliability of the code changes. By verifying functionality, debugging issues,
performing local optimized builds and regressions, testing GUI-related changes, and
running PV regressions and ASAN builds, developers could ensure that the code changes
were stable and reliable.

Chapter 7

SCREENSHOTS OF GUI

Figure 13: Library Manager of Voltus-xFi

Figure 14: Opening Assembler view

Figure 15: Extraction Setup

Figure 16: Simulation Files Setup

Figure 17: EMIR Analysis Setup Form

Figure 18: EMIR analysis plots

Figure 19: Output Reports

Figure 20: IR/EM Results Form

Figure 21: IR drop plot on the Virtuoso layout

Figure 22: EM violations in the Annotation Browser

Figure 23: Pin2PinR calculations

Figure 24: Viewing the worst violations in the layout

Figure 25: Viewing the EM analysis results
Chapter 8

CONCLUSION AND FUTURE SCOPE

8.1 CONCLUSION

The comprehensive testing and debugging efforts across various projects have yielded
significant results. Through meticulous testing strategies, including functional
verification, performance testing, edge case testing, and user experience testing, we have
ensured the quality and reliability of our codebase. The debugging efforts, utilizing tools
like Valgrind, GDB, and ASAN, have helped identify and fix numerous issues, resulting
in a more stable and maintainable codebase. Specifically, the compiler warning setup and
fixing effort reduced warnings from 1583 to 28, demonstrating the effectiveness of our
systematic approach.

The testing strategy employed a multi-faceted approach to verify the functionality,
performance, and stability of the code. The use of various testing tools and techniques,
such as regression testing, GUI testing, and ASAN builds, ensured that the code was
thoroughly tested and validated. The debugging efforts, using tools like Valgrind and
GDB, helped identify and fix memory-related issues, syntax errors, and other bugs.

The outcome of these efforts is a more robust, reliable, and maintainable codebase. The
reduction in compiler warnings and the fixing of bugs have improved the overall quality
of the code, making it more efficient and less prone to errors. The testing and debugging
efforts have also helped identify areas for improvement, enabling us to refine our
development process and ensure that future projects are completed with even higher
quality.

8.2 FUTURE SCOPE

The future scope of our projects is vast and exciting. As we continue to develop and
maintain our codebase, we aim to further enhance its quality, reliability, and
maintainability.

Some of the key areas we will focus on in the future include:

1. Sustained Code Quality: We will continue to prioritize code quality and
maintainability by regularly reviewing and refining our codebase. This will
involve ongoing testing and debugging efforts, as well as the adoption of best
practices and coding standards.
2. Automated Testing: We plan to expand our automated testing frameworks to
cover more scenarios and ensure that issues are caught early in the development
cycle. This will enable us to reduce the number of bugs and errors in our code,
making it more reliable and maintainable.
3. Continuous Integration and Deployment: We will further integrate continuous
integration and deployment tools to streamline our development process and
reduce manual errors. This will enable us to deliver high-quality code more
efficiently and effectively.
4. Collaboration and Knowledge Sharing: We will foster collaboration among team
members to share knowledge, best practices, and lessons learned. This will enable
us to leverage each other's strengths and expertise, ensuring that our collective
knowledge and skills continue to grow.
5. Innovation and Improvement: We will continuously seek opportunities for
innovation and improvement, exploring new technologies, tools, and
methodologies to stay ahead of the curve. This will enable us to deliver cutting-
edge solutions that meet the evolving needs of our users.

By focusing on these areas, we aim to further enhance the quality, reliability, and
maintainability of our projects, driving long-term success and growth. Our goal is to
deliver high-quality solutions that meet the needs of our users, while also ensuring that
our codebase remains robust, efficient, and maintainable.

In conclusion, the testing and debugging efforts across our projects have been highly
successful, resulting in a more stable and maintainable codebase. As we look to the
future, we are excited to build on this success and continue to deliver high-quality
solutions that meet the evolving needs of our users. With a focus on sustained code
quality, automated testing, continuous integration and deployment, collaboration and
knowledge sharing, and innovation and improvement, we are confident that our projects
will continue to thrive and grow.

REFERENCES

1. C++ Reference. Available at: https://en.cppreference.com

2. Cadence’s Internal Documentation

3. Valgrind Documentation. Available at: http://valgrind.org/docs/manual/manual.html

4. GDB Documentation. Available at: https://sourceware.org/gdb/current/onlinedocs/gdb/

5. AddressSanitizer (ASan) Documentation. Available at: https://clang.llvm.org/docs/AddressSanitizer.html

6. C++ Best Practices. Available at: https://github.com/lefticus/cppbestpractices

7. Linux Kernel Documentation. Available at: https://www.kernel.org/doc/html/latest/

8. Perl Documentation. Available at: https://perldoc.perl.org

9. CSH (C Shell) User Guide. Available at: https://www.cs.cmu.edu/~./quake-csf/onlamp/csh.html

10. Stack Overflow – Developer Q&A. Available at: https://stackoverflow.com

11. Git Documentation. Available at: https://git-scm.com/doc

12. Perforce Documentation. Available at: https://www.perforce.com/manuals

BRIEF PROFILE OF STUDENT

Student Name: Vansh Sardana

Roll No: 21001017067

Course: Bachelor of Technology (B.Tech)

Branch: Electronics Engineering with specialization in Internet of Things (2021-2025)

Email Address: vanshsrdn25703@gmail.com

Phone No.(M): +91 7011263403

Project Title: Voltus-xFi: Research and Development

Company Name: Cadence Design Systems, Noida

