Industrial Training Report
on
VOLTUS-xFi : RESEARCH AND DEVELOPMENT
Submitted By:
Vansh Sardana (21001017067)
CANDIDATE’S DECLARATION
I hereby certify that the work which is being presented in this project report titled
“Voltus-xFi: Research and Development”, submitted to J. C. Bose University of
Science & Technology, YMCA, Faridabad, is an authentic record of my own work
carried out at the company Cadence Design Systems, Noida. The work contained in
this report has not been submitted to any other University or Institute.
Student Signature:
EXAMINERS’ EVALUATION
Internal Examiner
Signature: ___________________
Name: ___________________
Designation: ___________________
Date: ___________________
External Examiner
Signature: ___________________
Name: ___________________
Designation: ___________________
Date: ___________________
ACKNOWLEDGEMENT
I would like to express my deepest appreciation to Dr. Munish Vashishth, Head of
the Department of Electronics Engineering, and Prof. Rajesh Kr. Ahuja, Training and
Placement Officer, for their invaluable contributions in providing me with an excellent
opportunity to pursue this internship. Their support and encouragement have been
instrumental in helping me secure this valuable experience.
I would also like to extend my thanks to Ms. Archana Jain (Faculty Mentor), Ms.
Archana Agarwal (Faculty Mentor) and all the faculty members of the Department of
Electronics Engineering at J.C. Bose University of Science and Technology, YMCA,
Faridabad for their constant encouragement and assistance in shaping my academic and
professional journey.
I express my heartfelt thanks to my parents, friends and colleagues for their constant
support, love and motivation.
Vansh Sardana
21001017067
TABLE OF CONTENTS
IV. Acknowledgement v
V. Table of Contents vi
5. Chapter 5: Tech Stack Used 23
9. References 52
Chapter 1
COMPANY PROFILE
Website: https://www.cadence.com/
1. Virtuoso Platform
● Schematic Capture and Simulation: Provides an intuitive environment for
creating analog and mixed-signal designs and running detailed circuit
simulations.
● Layout and Physical Design: Offers advanced layout editing, automation,
and analog placement and routing tools to accelerate design closure.
● Mixed-Signal Verification: Integrates with simulation tools like Spectre
and supports behavioral modeling for mixed-signal SoCs.
● Design Rule Checking and Verification: Ensures manufacturability and
performance with integrated physical and electrical rule checking.
● Advanced Node Support: Optimized for cutting-edge process nodes like
5nm and below, enabling design scalability and performance.
● Logic Optimization and Area Reduction: Helps reduce silicon area and
improve performance.
● Multi-Scenario Synthesis: Optimizes designs for various process-voltage-
temperature (PVT) conditions.
● Incremental Synthesis: Shortens turnaround time for large SoC designs.
8. Allegro PCB Design Suite
9. Clarity 3D Solver
Chapter 2
INTRODUCTION TO PROJECT
TEAM STRUCTURE:
The motivation behind this project was multifaceted and driven by several key factors.
● Reducing Costs: By streamlining the codebase and reducing dependencies, we
expected to lower development and maintenance costs. This would enable the
company to allocate resources more effectively and invest in other areas of the
product.
● Improving Customer Satisfaction: Ultimately, the project's goal was to deliver a
better product to customers. By improving performance, reducing errors, and
enhancing overall quality, we aimed to increase customer satisfaction and build
loyalty.
Figure 3: Problem Identification in Voltus-xFi
1. Compile-Time Warnings
● The product's memory utilization was high; competitors provided similar
functionality with memory usage between 60 GB and 100 GB.
● Cadence's tool was using approximately 180 GB, making it less
competitive and potentially impacting performance.
● There was no automated mechanism to track and report daily build status
and warning trends, making it challenging for the team to identify issues
early on and track progress.
Each of these issues presented a unique challenge, but addressing them would have a
significant impact on the product's performance, maintainability, and overall quality.
The objective of this project was to improve the overall efficiency, maintainability, and
reliability of Voltus-xFi by achieving the following specific goals:
1. Compile-Time Warning Reduction
2. Dependency Reduction
3. Memory Optimization
4. Automated Daily Build Tracking
Chapter 3
REQUIREMENT ANALYSIS
The requirement analysis for this project involved a thorough examination of the
Voltus-xFi codebase and the identification of key objectives and requirements for the tasks
performed. The primary goals of the project were to reduce unnecessary dependencies,
optimize memory usage, minimize compile-time warnings, and implement an automated
daily build tracking process.
3.1 REQUIREMENTS
● Analyze the codebase to identify areas for improvement, including unnecessary
dependencies, memory-intensive code, and compile-time warnings.
● Implement changes to reduce dependencies, memory utilization, and warnings,
while ensuring that the functionality and behavior of the product are not impacted.
● Use debugging tools such as GDB and Valgrind to identify and fix complex issues.
● Develop a script to automate daily build tracking and reporting, providing
visibility into build status and warning trends.
● Ensuring that the changes made to the codebase do not impact the functionality
and behavior of the product.
● Optimizing memory usage to make the product more efficient and competitive,
with minimal reduction in precision.
3.4 METHODOLOGY
To achieve the project objectives and meet the requirements, the following methodology
was employed:
● Code analysis: The codebase was thoroughly analyzed to identify areas for
improvement.
● Code refactoring: Changes were made to the codebase to reduce dependencies,
memory utilization, and warnings.
● Debugging: Debugging tools such as GDB and Valgrind were used to identify and
fix complex issues.
● Automation: A script was developed to automate daily build tracking and
reporting.
By understanding the requirements and objectives of the project, the team was able to
deliver the expected outcomes and achieve the project goals. The requirement analysis
played a critical role in ensuring that the project was well-planned and executed, and that
the deliverables met the needs of the stakeholders.
Chapter 4
4.1 BRIEF
Across all projects, a structured approach was employed to ensure successful outcomes.
Key design considerations included:
By considering these factors, the projects were able to achieve significant benefits,
including improved efficiency, reduced memory utilization, minimized warnings, and
enhanced productivity.
4.2 DEPENDENCY OPTIMISATION
2. Kit Usage Identification: A reference guide with error messages was
created to identify required kits.
3. Dependency Reduction: 143 unneeded kits were identified and removed,
reducing dependencies by 45% and improving product size and maintainability.
1. API Behavior Analysis: The API behavior was confirmed, and multiple
edge cases were verified.
● Script-Based Replacement: Used for 4000+ occurrences of simple
APIs, such as data types or simple functions.
1. Memory Profiling: The first step was to identify areas of high memory utilization
in the code. This was done through memory profiling, which helped to pinpoint
specific classes and functions that were consuming excessive memory.
Figure 7: Memory Optimization Process
● Data Type Optimization: Changing data types to more efficient ones, such
as from double to float (see the sketch after this list).
● Code Refactoring: Refactoring code to reduce unnecessary memory
duplication and improve efficiency.
● Report Generation Optimization: Modifying code to generate reports only
for specified currents mentioned in the config file.
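To make the first technique concrete, here is a minimal, self-contained C++ sketch; it is an illustration rather than Voltus-xFi code, and the struct and field names are invented. It shows why moving per-node analysis data from double to float roughly halves memory on a large design, and what precision is traded away:

    #include <cstddef>
    #include <cstdio>

    // Hypothetical per-node EMIR sample; names are illustrative only.
    struct NodeSampleDouble { double voltage; double current; };
    struct NodeSampleFloat  { float  voltage; float  current; };

    int main() {
        const std::size_t kNodes = 10000000;  // a large power-grid netlist
        // 16 bytes per node with double vs 8 bytes with float:
        std::printf("double: %zu bytes/node, %zu MiB total\n",
                    sizeof(NodeSampleDouble),
                    kNodes * sizeof(NodeSampleDouble) / (1024 * 1024));
        std::printf("float:  %zu bytes/node, %zu MiB total\n",
                    sizeof(NodeSampleFloat),
                    kNodes * sizeof(NodeSampleFloat) / (1024 * 1024));
        // float keeps ~7 significant decimal digits versus ~15-16 for double,
        // so the substitution is only safe where that loss is acceptable.
        return 0;
    }

The final comment is the key caveat: halving memory this way is only worthwhile where roughly seven significant digits suffice for the stored quantity.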
2. Code Review: A code review was performed to understand the context and
impact of each warning.
3. Warning Fixing: Warnings were fixed by addressing the underlying issues,
such as unused variables, type mismatches, and syntax errors.
4. Code Refactoring: Code was refactored to improve readability,
maintainability, and reduce potential issues.
After these fixes, the warning count dropped from a total of 1583 warnings to only
28; the remaining warnings were due to other kits or more complex issues that
required further investigation and resolution.
1. Script Development: The script was developed to perform the following tasks:
● Identify Code Base: Determine the code base, version, and port being
used.
● Run Make Commands: Execute the required make commands based on
the identified parameters.
● Save Logs: Save the logs generated during the build process.
2. Warning Comparison: A Perl script (see the sketch after this list) was developed to:
● Count Warnings: Count the number of warnings generated during the
build process.
● Compare Warnings: Compare the current warnings with the previous
warnings to identify any changes.
3. Notification: An email was sent to the team group on Outlook with the results
of the build and warning comparison.
4. Automation: The script was automated using a cron job, ensuring that it ran at
regular intervals without manual intervention.
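The comparison logic itself was implemented in Perl, as noted above. Purely as an illustration of that logic, the sketch below reimplements the count-and-compare step in C++; the file names (build.log, warnings.prev) are hypothetical placeholders, and the "warning:" marker assumed here is the one GCC and Clang emit:

    #include <fstream>
    #include <iostream>
    #include <string>

    // Count lines containing the compiler's "warning:" marker in a build log.
    static int countWarnings(const std::string& logPath) {
        std::ifstream log(logPath);
        int count = 0;
        for (std::string line; std::getline(log, line);)
            if (line.find("warning:") != std::string::npos) ++count;
        return count;
    }

    int main() {
        const int current = countWarnings("build.log");
        int previous = 0;
        std::ifstream prev("warnings.prev");
        prev >> previous;  // stays 0 if there is no previous run yet

        if (current > previous)
            std::cout << "REGRESSION: warnings rose from " << previous
                      << " to " << current << "\n";
        else
            std::cout << "OK: " << current << " warnings (was "
                      << previous << ")\n";

        std::ofstream("warnings.prev") << current << "\n";  // state for next run
        return 0;
    }

In production, the equivalent script ran unattended on the cron schedule described in step 4, and the result was emailed to the team as described in step 3.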
Chapter 5
TECH STACK USED
5.1 C / C++
C and C++ are powerful general-purpose programming languages that form the
backbone of many modern software systems, especially those requiring performance,
control over hardware, and resource efficiency. In the context of this internship
project, C and C++ were chosen for their close-to-metal capabilities, deterministic
behavior, and efficient execution, particularly in scenarios involving memory-
sensitive operations and low-level system interactions.
The choice of C and C++ was driven by the nature of the project, which
involved developing components that required high performance, memory
management control, and direct hardware interfacing. C enabled low-level
programming such as pointer arithmetic, manual memory allocation, and
bitwise operations, which were essential for optimizing resource usage. C++,
being largely a superset of C, allowed object-oriented programming (OOP),
enabling better code modularity, maintainability, and reusability through the
use of classes, inheritance, polymorphism, and encapsulation.
Manual memory handling using malloc, calloc, free (in C) and new,
delete (in C++) allowed for optimized resource allocation.
● Object-Oriented Programming
Containers like vector, map, set, and algorithms from STL helped manage
dynamic data efficiently and provided tested, reliable implementations.
● File Handling
The project included reading and writing to configuration files and logs
using fstream in C++, ensuring persistence and traceability (a combined
sketch of these facilities follows this list).
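As a minimal sketch of how these three facilities fit together (assuming C++17; the identifiers below are invented for illustration and are not project code):

    #include <fstream>
    #include <map>
    #include <string>
    #include <vector>

    int main() {
        // Manual memory management: every new[] must meet exactly one delete[].
        double* samples = new double[4]{1.08, 1.05, 1.02, 0.99};
        double sum = 0.0;
        for (int i = 0; i < 4; ++i) sum += samples[i];
        delete[] samples;  // omitting this line is exactly the leak Valgrind flags

        // STL containers: dynamic storage without manual allocation.
        std::vector<std::string> nets = {"VDD", "VSS", "CLK"};
        std::map<std::string, double> avgVoltage;
        avgVoltage[nets[0]] = sum / 4.0;

        // File handling: persist results to a log with fstream.
        std::ofstream log("run.log");
        for (const auto& [net, v] : avgVoltage)
            log << net << " " << v << "\n";
        return 0;
    }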
5.1.4 Applications in the Project
One of the key challenges was managing memory efficiently while avoiding
segmentation faults and memory leaks. This required an in-depth
understanding of pointer operations, dynamic memory allocation, and
debugging with tools like Valgrind and ASan. The experience also reinforced
best practices such as:
5.2 LINUX
Linux is a widely-used open-source operating system known for its stability, security,
and developer-friendly environment. During the internship, Linux served as the
primary development platform due to its compatibility with low-level programming
tools, efficient memory management, and powerful command-line utilities.
The project involved system-level programming, debugging, and performance
analysis—areas where Linux excels. It provided direct access to system
resources, support for manual memory handling in C/C++, and integration
with essential development tools. Linux also offered better compatibility with
utilities like GCC, GDB, Valgrind, and Make, all of which played a critical
role in the development workflow.
5.3 VISUAL STUDIO CODE
Visual Studio Code (VS Code) is a lightweight yet powerful source-code editor
developed by Microsoft. It offers built-in support for numerous programming
languages and tools, making it an ideal Integrated Development Environment (IDE)
for both system-level and high-level development. During the internship, VS Code
was used as the primary code editor due to its speed, extensibility, and developer-
friendly features.
VS Code provided a seamless and efficient environment for editing,
debugging, and managing project files. Its support for C/C++, Python, Shell
scripting, and Git made it versatile for multi-language development.
Integration with extensions like C/C++ IntelliSense, CodeLLDB, and Python
enhanced the development experience by offering intelligent code completion,
inline error detection, and real-time debugging.
5.4 PERFORCE
The project involved working with multiple developers, large code files, and
frequent updates. Perforce was chosen for its ability to handle large binary
assets and codebases with speed and reliability. Its centralized model allowed
tight control over code changes while offering features like changelists,
shelving, and file locking—crucial for preventing merge conflicts and
ensuring consistency.
● Changelists: Grouped related file changes for better organization and review.
● Shelving: Temporarily stored changes without committing, useful for code
reviews and task switching.
● File Locking: Prevented simultaneous edits on critical files.
● Workspace Management: Mapped local file structures to depot locations for
smooth development and deployment.
● Integration with VS Code: Used extensions to view diffs, submit changelists,
and resolve conflicts directly from the IDE.
5.5 REVIEW BOARD
Review Board is a web-based code review tool that helps teams improve code
quality through collaborative peer reviews. It streamlines the review process by
allowing developers to submit code changes, comment on specific lines, and track
revisions over time. During the internship, Review Board played a key role in
maintaining code standards, catching bugs early, and encouraging team-based
feedback on submitted changes.
As part of a collaborative development workflow, it was essential to have a
platform that facilitated systematic code reviews. Review Board integrated
well with Perforce, allowing changelists to be directly uploaded for review
before being submitted to the repository. This ensured that every code change
was examined for correctness, style, and impact, reducing errors and
improving maintainability.
● Status Tracking: Provided a clear workflow with “Ship It”, “Needs Fixing”,
or “Open Issues” statuses.
This helped catch defects and enforce coding standards early in the development
cycle. The asynchronous review model also ensured that feedback was timely and
documented, making collaboration easier and more transparent across the team.
5.6 TIGERVNC VIEWER
The internship setup often required interacting with remote Linux machines
that had specific toolchains, licensed software, or test environments not
available locally. TigerVNC Viewer provided a reliable way to visually access
and operate these remote desktops, enabling development, testing, and
debugging without being physically present or requiring local installation of
all tools.
● Session Persistence: Allowed ongoing sessions to be resumed even after
network interruptions or system restarts.
● Lightweight & Fast: Delivered a responsive remote desktop experience even
over limited bandwidth connections.
● Cross-Platform Support: Used on Windows systems to access Linux
development environments seamlessly.
● Secure Connections: Integrated with SSH tunneling for secure remote
access.
5.7 VALGRIND
● Leak Summary: Provided categorized reports (definite, possible, reachable
leaks) for targeted fixes.
● Stack Traces: Helped locate the exact lines and call stacks responsible for
memory errors (a concrete example follows this list).
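As a concrete illustration, the toy program below contains a deliberate leak of the kind these reports pinpoint; it is not project code, and the commands assume g++ and Valgrind are installed:

    // leak.cpp -- a deliberate "definitely lost" allocation for Valgrind.
    // Build and run:
    //   g++ -g -O0 leak.cpp -o leak
    //   valgrind --leak-check=full --show-leak-kinds=all ./leak
    #include <cstring>

    char* makeLabel(const char* name) {
        char* label = new char[64];     // allocated here...
        std::strncpy(label, name, 63);
        label[63] = '\0';
        return label;
    }

    int main() {
        char* label = makeLabel("net_VDD");
        (void)label;
        return 0;  // ...never freed: Valgrind reports 64 bytes definitely
                   // lost, with a stack trace pointing back at makeLabel().
    }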
5.8 CSH
CSH, or C Shell, is a Unix shell developed as an alternative to the Bourne shell (sh).
It provides a C-like syntax for scripting and interactive use, along with features like
command history, job control, and scripting capabilities. During the internship, CSH
was used primarily for automating build steps, running environment setup scripts, and
executing repeatable development tasks on Linux systems.
In the project environment, several existing scripts and system tools were
implemented using CSH. Leveraging CSH enabled compatibility with legacy
scripts while offering a structured and familiar syntax for users with C
programming backgrounds. It allowed for quick prototyping and modification
of automated workflows in a Unix terminal setting.
● Script Automation: Created and modified .csh scripts to automate
environment setup, file backups, and test runs.
● Control Flow Statements: Used if, switch, while, and foreach for decision-
making and looping in scripts.
5.9 PERL
Perl is a high-level, interpreted programming language known for its powerful text-
processing capabilities and support for scripting, system administration, and rapid
prototyping. During the internship, Perl was primarily used for automating test
workflows, parsing logs, and manipulating configuration files in large-scale build
environments.
Perl was integrated into the existing infrastructure for scripting tasks involving
file parsing, data extraction, and dynamic file generation. Its ability to handle
regular expressions and manipulate large text files with minimal code made it
ideal for post-processing logs, generating reports, and customizing
configuration setups in the project.
● Text & File Processing: Efficiently read, wrote, and edited files using built-
in functions like open, print, and chomp.
Perl simplified many scripting tasks that would have been complex in other
languages. Its expressive syntax and mature ecosystem enabled fast
development and integration with system tools. It played a vital role in
automating routine operations and improving the efficiency of build and
validation workflows.
Chapter 6
TESTING STRATEGY
The testing strategy employed a comprehensive approach to ensure the quality and
reliability of the code changes. The strategy included a multi-faceted approach to verify
the functionality, performance, and stability of the code.
1. Regression Testing: Running regression tests to confirm that the code changes
did not introduce any functional or performance issues. The goal was to identify
any regressions that could impact the overall quality of the code.
2. Debug Builds: Creating debug builds for all ports to facilitate further debugging
and testing. These builds included debugging symbols and allowed developers to
step through the code, examine variables, and identify issues. Build times
ranged from roughly 18 minutes to 2 hours, depending on the type of port.
1. Error Handling: Testing the code's error handling mechanisms to ensure that they
handled unexpected inputs and edge cases correctly. This involved testing the
code's ability to handle errors and exceptions, and ensuring that it provided useful
error messages and logging information.
2. Boundary Value Testing: Testing the code's behavior at boundary values, i.e., at
the limits of its input range, to ensure that it handled these cases correctly
(a minimal illustration follows this list).
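A minimal illustration of the idea, using a hypothetical clamping helper (not a Voltus-xFi function): values exactly at, just inside, and just outside each limit are exercised explicitly.

    #include <cassert>
    #include <limits>

    // Hypothetical helper: clamp a raw current value into a legal range.
    double clampCurrent(double amps) {
        const double kMax = 1.0e3, kMin = -1.0e3;
        return amps > kMax ? kMax : (amps < kMin ? kMin : amps);
    }

    int main() {
        assert(clampCurrent(1.0e3)       ==  1.0e3);  // exactly at the upper bound
        assert(clampCurrent(-1.0e3)      == -1.0e3);  // exactly at the lower bound
        assert(clampCurrent(999.0)       ==  999.0);  // just inside the range
        assert(clampCurrent(1.0e3 + 1.0) ==  1.0e3);  // just outside: clamped
        assert(clampCurrent(std::numeric_limits<double>::infinity()) == 1.0e3);
        return 0;
    }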
1. Virtuoso Testing: Testing Virtuoso for GUI-related changes to ensure that they
did not introduce any issues and that the EMIR report was generated successfully.
This involved testing the GUI functionality to ensure that it worked correctly and
provided the required information.
1. Debugging: Using Valgrind and GDB for debugging issues and errors. Valgrind
is a powerful tool for debugging memory-related issues, such as memory leaks or
invalid memory accesses. GDB, on the other hand, is a general-purpose debugger
that allows developers to step through code, examine variables, and identify the
root cause of issues.
1. Code Review: Raising code reviews for mentors and incorporating their
suggestions to ensure code quality. This involved reviewing the code changes to
ensure they met the coding standards and best practices. The goal was to ensure
that the code changes were maintainable, efficient, and easy to understand.
6.7 DEPLOYMENT
1. CM Build and Regressions: Launching CM builds and running regressions to
ensure the code changes did not introduce any issues. These builds and
regressions tested the code changes in various scenarios, ensuring that they were
stable and reliable.
3. ASan Build: Launching AddressSanitizer (ASan) builds to check for memory
leaks and ensure the code changes did not introduce any memory-related issues.
ASan is a memory error detector that identifies issues such as memory leaks or
invalid memory accesses (a minimal example follows).
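For illustration only, the toy use-after-free below is the kind of defect an ASan build aborts on; it is not project code, and the command assumes a GCC or Clang toolchain:

    // uaf.cpp -- a deliberate heap-use-after-free for ASan to catch.
    // Build and run:
    //   g++ -g -fsanitize=address -fno-omit-frame-pointer uaf.cpp -o uaf
    //   ./uaf
    int main() {
        int* p = new int[8];
        delete[] p;
        return p[0];  // ASan aborts here, printing both the allocation
                      // and the free stack traces for the buffer
    }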
CHAPTER 7
SCREENSHOTS OF GUI
Figure 13: Library Manager of Voltus-xFi
Figure 15: Extraction Setup
Figure 16: Simulation Files Setup
Figure 18: EMIR analysis plots
Figure 20: IR/EM Results Form
Figure 21: IR drop plot on the Virtuoso layout
Figure 23: Pin2PinR calculations
Figure 25: Viewing the EM analysis results
Chapter 8
CONCLUSION AND FUTURE SCOPE
8.1 CONCLUSION
The comprehensive testing and debugging efforts across various projects have yielded
significant results. Through meticulous testing strategies, including functional
verification, performance testing, edge case testing, and user experience testing, we have
ensured the quality and reliability of our codebase. The debugging efforts, utilizing tools
like Valgrind, GDB, and ASan, have helped identify and fix numerous issues, resulting
in a more stable and maintainable codebase. Specifically, the compiler warning setup and
fixing effort reduced warnings from 1583 to 28, demonstrating the effectiveness of our
systematic approach.
The testing strategy employed a multi-faceted approach to verify the functionality,
performance, and stability of the code. The use of various testing tools and techniques,
such as regression testing, GUI testing, and ASan builds, ensured that the code was
thoroughly tested and validated. The debugging efforts, using tools like Valgrind and
GDB, helped identify and fix memory-related issues, syntax errors, and other bugs.
The outcome of these efforts is a more robust, reliable, and maintainable codebase. The
reduction in compiler warnings and the fixing of bugs have improved the overall quality
of the code, making it more efficient and less prone to errors. The testing and debugging
efforts have also helped identify areas for improvement, enabling us to refine our
development process and ensure that future projects are completed with even higher
quality.
8.2 FUTURE SCOPE
The future scope of our projects is vast and exciting. As we continue to develop and
maintain our codebase, we aim to further enhance its quality, reliability, and
maintainability.
1. Sustained Code Quality: We will continue to prioritize code quality and
maintainability by regularly reviewing and refining our codebase. This will
involve ongoing testing and debugging efforts, as well as the adoption of best
practices and coding standards.
2. Automated Testing: We plan to expand our automated testing frameworks to
cover more scenarios and ensure that issues are caught early in the development
cycle. This will enable us to reduce the number of bugs and errors in our code,
making it more reliable and maintainable.
3. Continuous Integration and Deployment: We will further integrate continuous
integration and deployment tools to streamline our development process and
reduce manual errors. This will enable us to deliver high-quality code more
efficiently and effectively.
4. Collaboration and Knowledge Sharing: We will foster collaboration among team
members to share knowledge, best practices, and lessons learned. This will enable
us to leverage each other's strengths and expertise, ensuring that our collective
knowledge and skills continue to grow.
5. Innovation and Improvement: We will continuously seek opportunities for
innovation and improvement, exploring new technologies, tools, and
methodologies to stay ahead of the curve. This will enable us to deliver cutting-
edge solutions that meet the evolving needs of our users.
By focusing on these areas, we aim to further enhance the quality, reliability, and
maintainability of our projects, driving long-term success and growth. Our goal is to
deliver high-quality solutions that meet the needs of our users, while also ensuring that
our codebase remains robust, efficient, and maintainable.
In conclusion, the testing and debugging efforts across our projects have been highly
successful, resulting in a more stable and maintainable codebase. As we look to the
future, we are excited to build on this success and continue to deliver high-quality
solutions that meet the evolving needs of our users. With a focus on sustained code
quality, automated testing, continuous integration and deployment, collaboration and
knowledge sharing, and innovation and improvement, we are confident that our projects
will continue to thrive and grow.
REFERENCES
BRIEF PROFILE OF STUDENT
Things (2021-2025)