GOPI KRISHNA P
PROFESSIONAL SUMMARY:
   •   Over 10 years of experience in the areas of Data Science, Machine Learning, Predictive
       Modeling, Data Analytics, Data Engineering, and Data Visualization.
   •   Experience in software development methodologies such as Waterfall, Agile, and Scrum.
       Managed and led an ML team delivering AI/Business Intelligence projects.
   •   Experienced in implementing solutions using Large Language Models (LLMs). Conversant with
       Generative AI (Azure GPT-3.5 and GPT-4) and experienced with other LLM products: Hugging
       Face, Anthropic, BERT, Google Vertex AI, Copilot, etc.
   •   Experience in implementing supervised predictive models, classification and regression
       models, recommendation systems, etc., using state-of-the-art ML/AI approaches.
   •   Experienced with NLP techniques: topic modelling, sentiment analysis, natural language
       conversation/synthesis models, and quality audit models.
   •   Expertise in transforming business resources and requirements into manageable data formats
       and analytical models, designing algorithms, building models, and developing data pipelines
       for end-user consumption.
   •   Experience in designing, developing, and executing cutting-edge deep learning and
       generative AI architectures (CNN, RNN, LSTM, GRU, etc.) and models that align with industry
       best practices and meet customer needs.
   •   Experience with programming languages (Python, R), Business Intelligence tools (Tableau,
       Power BI), and data platforms (Azure cloud/Databricks).
   •   Experience in building models using machine learning modules on AWS, Azure (Microsoft
       Certified: Azure Data Scientist Associate), and Google Cloud Platform.
   •   Working knowledge of building real-time and batch data pipelines using CI/CD. Proficient in
       deploying ML models to Azure/AWS cloud services and monitoring their performance.
   •   Leveraged process mining techniques to analyze and optimize operations; working knowledge
       of process mining tools (Celonis) and process optimization.
   •   Knowledge in containerizing applications using Docker and deploying them on Kubernetes
       clusters for efficient resource utilization and workload management.
TECHNICAL SKILLS
Languages                 Python, R (RStudio), SQL, SAS
BI Tools                  Power BI, Tableau, QlikView, Excel & MS Office tools
Platforms / Frameworks    Microsoft Azure suite, AWS suite, GCP, Celonis, Databricks, Streamlit
EDUCATION
   •   Advanced Degree in Business Analytics, Indian School of Business (ISB), 2018
   •   Diploma in Management (Operations), St Joseph's College, Bangalore, India, 2012
   •   Bachelor of Engineering, Osmania University, Hyderabad, India, 2006
CERTIFICATION & COURSE
   •   Azure DP-100 certified; AWS SageMaker certified
   •   Coursera/Udemy certified: Generative AI with LLMs
   •   SimpliLearn certified R, SAS, and Python professional
   •   Udemy: Deep Learning and Artificial Intelligence
PROFESSIONAL EXPERIENCE:
Healthcare Payor & Providers
Lead Data Scientist                                                                          09/18 - Present
Clients: HMSA, Texas Health, Norton Healthcare, Medtronic, UH-one, BSW, Parkland, etc.,
USA
    •   Translate business problems into analytics use cases, i.e., understand and evaluate the
        business requirement, qualify the analytics advantage, and define the metrics for evaluation.
    •   Used Generative AI tools (Azure OpenAI GPT-3 and GPT-3.5) to build topic models for agents'
        running notes on the claim adjudication process in a BPaaS environment, saving approximately
        $268K.
    •   Healthcare data from the claims adjudication process is fed into the chat completion model
        of Azure GPT-3.5, and relevant topics are extracted for each claim. The output is further
        refined into an overall theme for the text, and claims are bucketed by topic for the
        operations team to simplify adjudication. This is implemented in an Azure confidential
        cloud environment.
    •   Worked on implementing LangChain techniques in machine learning models for natural language
        processing tasks, including RAG methods, fine-tuning of LLMs, and validation.
    •   Built a similar Generative AI model to extract auditor information and standardize the
        claim resolution process for inquiries received on claim adjustments; optimized the
        resolution activity and saved about 40 hours/week on an ongoing project.
    •   Iterated over various prompt templates (open/closed/multiple choice/Q&A/continuous) in
        OpenAI to create multiple proofs of concept using LLMs.
    •   Recommended strategies to increase the digital footprint and collectible revenue in shorter
        intervals by analyzing customer and click data to choose the right communication channels;
        built strategies to implement the channels appropriately and performed A/B testing.
    •   Remote patient monitoring system: developed a recommender system using a collaborative
        filtering approach and clustering techniques to determine the right target groups for
        marketing the available plans and devices.
    •   Recommended campaign rollout strategies after analyzing click-to-penetration ratios for
        digital channels in the Mred program. Designed A/B tests while maintaining a control
        population to observe the incremental impact of emails, and segmented the impacted users.
    •   Designed and developed the architecture for the entire debt-collection restructuring model:
        end-to-end implementation of a predictive model classifying which claims would be penalized
        with interest from delayed processing, preventing substantial losses in interest payments.
        Assessed the risk level of claims to help prevent interest levied on critical claims.
        Scored incoming claims in a daily data refresh and updated the model quarterly.
    •   Built predictive models beyond adjudication, including propensity to pay interest,
        propensity for claim denials, and propensity for errors/quality issues, using traditional
        and advanced ML techniques. Integrated process mining and performed MLOps with process
        mining features. Integrated the model with Flask to expose it to users as a microservice.
    •   Developed automated ETL pipelines to extract data from various sources, transform it, and
        load it into the data warehouse; monitored and optimized these pipelines for efficiency.
    •   Extensive use of machine learning algorithms (supervised: logistic regression, decision
        trees, random forest, XGBoost, SVM, Naïve Bayes, KNN; unsupervised: K-modes, hierarchical
        clustering, association mining) applied to predictive models for applicable use cases in
        the claim adjudication process.
Natural Conversation Analytics (NLP) | BTO | Media & Banking | Operations
Sr. Data Scientist
    •   Analyzed agent performance and improved customer handling for a key UK client, raising the
        NPS score from 2.5 to 4.
    •   Implemented real-time Argo workflow pipelines and integrated them with machine learning
        models, ensuring seamless data-to-insights translation for business stakeholders.
    •   Handled a combination of unstructured and structured data from multiple sources and
        automated the cleaning using Python scripts.
    •   Transcribed audio files using commercially available cloud platforms (Azure and AWS speech
        tools). Performed text analytics on the transcripts using natural language processing
        techniques for sentiment analysis, providing a net score and ranking for each interaction.
        Used advanced AI models (Transformers and GANs) to generate natural language conversation
        that aids the operator seamlessly.
    •   Analyzed calls to understand customer-agent negotiation and successful payouts.
    •   Built ML pipelines and tested and evaluated the entire workflow in Azure cloud: created
        storage space, built models using Azure ML, and sent signals to the end dashboard.
    •   Monitored and evaluated the revenue uplift from implementing the models. Forecast volumes
        of accounts placed and payments for the entire portfolio.
    •   Built time series models using ARIMA, SARIMA (seasonal ARIMA), Prophet, Holt-Winters, and
        neural networks (RNN-LSTM, GRU), implemented in PyTorch and Keras.
Infosys                                                                           Bangalore, India & USA
Data Science Lead                                                                            07/13 – 08/18
Clients: Toyota Boshoku, Syngenta, CISCO, Microsoft BING, and Spirit.
    •   Brainstormed and detailed use cases with the client and sales teams.
    •   Designed and implemented the architecture in conjunction with multiple ERP platforms.
        Integrated data sources from various ERP systems to generate the required datasets for
        analysis. Developed various SaaS applications to help the business team make organized
        decisions.
    •   Collected customer data for the past two years and performed Information Value (IV)
        analysis initially. Applied logistic regression to the training data using key variables
        and validated it on test data. Also performed customer segmentation and profiling using
        K-means.
    •   Performed topic modelling and problem management analysis on incidents data: extracted
        keywords from the content and aligned them by frequency and context. Studied
        customer-centric information for repetitive and high-priority incidents and suggested
        proactive steps to improve latency. Used LDA and NLP techniques.
    •   Predicted the number of incidents across SAP systems, studying their trend and seasonality
        components over time with various time series models. Shared the approximate predicted
        incident count for each configuration with the resource allocation manager to plan for
        contingencies.
    •   Delivered insights and a predictive model on pricing data to determine whether a customer
        would cancel or create orders.
    •   Responsible for extracting and loading historical data from source to the SQL environment
        for various business operations using a data integration tool; performed data analysis to
        create reporting requirements, specifying inclusion and exclusion criteria, conditions,
        business rules, and data elements to be included in the report.
    • Created and presented executive dashboards to show the patterns & trends in the data using
        Tableau.
    •   Developed trend lines, funnel charts, donut charts, heat maps, tree maps, drilldown
        reports, 100% stacked bar charts, etc. in Tableau.
    •   Designed queries, compiled data, and generated reports in Excel and Tableau.
    •   Responsible for developing KPIs by scheduling standard procedures for execution.
    •   Responsible for creating and managing various database objects such as tables, stored
        procedures, views, triggers, and database constraints to meet business requirements.
    •   Used Tableau to visualize data from a given dataset. Used R for statistical modeling and
        data transformation before visualizing the data in Tableau; wrote R scripts and connected
        them to Tableau via an external ODBC connection.
    •   Good experience publishing various kinds of interactive dashboards and workbooks from
        Tableau Desktop to Tableau Server.
    •   Implemented customized and ad-hoc requirements for integrating data with the analytics
        platform and delivering visualizations in Tableau.
    •   Built models considering all the measures and grouping them into relevant group measures.
    •   Used dimensionality reduction techniques (PCA, factor analysis). Performed regression and
        latent variable analysis on the data to obtain group measure scores. Performed structural
        equation modelling to aggregate the group measure scores with optimized weights into a
        final score.
    •   Validated the final scores against benchmarks for reporting and applied K-means to rank the
        hospitals into five groups, attributing ranks 1-5 accordingly.
MOOG, Bangalore, India & USA
Senior Analyst                                                                                11/10 – 06/13
MITC is a leading manufacturer and supplier of aircraft actuators and servo motors.
Life-cycle tests are simulated in the lab for the entire load cycle experienced by the actuator
throughout its designed life under different environmental conditions.
CYIENT                                                                            Hyderabad, India & USA
Senior Design Engineer                                                                         05/06 – 11/10
Cyient is a leading services company for aerospace clients and a secondary supplier to Airbus and
Boeing.
     •   Meshed various parts and bearing components using HyperMesh and performed stress analysis
         in ANSYS to extract maximum and minimum principal stresses and determine the factor of
         safety.
AWARDS:
    •   Capstone Excellence: Digital Trendsetter using Celonis process mining
    •   Outstanding Team Award
    •   Excellence in supporting Business Delivery -2019
    •   Team Award- 2016
    •   Best performer Award