ZIEERS Systems
Bangalore, India
2022-Present
ZIEERS Systems is a startup founded in 2022 that builds domain-specific Large Language Models and multimodal LLMs (MLLMs) for law, finance, and medical diagnostics.
It also runs a startup-incubation wing that encourages and supports young entrepreneurs in building prototypes of their ideas.
Law LLM
Fine-tuned a pretrained Large Language Model (LLM) for lawyers in India, adapting a general-purpose LLM to Indian legal terminology, case law, statutes, court decisions, and the distinctive style of Indian legal writing.
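The fine-tuning work described above starts with data preparation. A minimal sketch of turning case-law excerpts into instruction-tuning records (JSONL); the field names (`citation`, `facts`, `holding`) are hypothetical, since the real corpus schema is not specified here:

```python
import json

def build_finetune_records(cases):
    """Convert raw case-law excerpts into instruction-tuning records.

    `cases` is a list of dicts with hypothetical keys 'citation',
    'facts', and 'holding' -- the actual schema would depend on the
    corpus used for fine-tuning.
    """
    records = []
    for case in cases:
        records.append({
            "instruction": f"Summarise the holding in {case['citation']}.",
            "input": case["facts"],
            "output": case["holding"],
        })
    # One JSON object per line, the usual format for fine-tuning pipelines.
    return [json.dumps(r, ensure_ascii=False) for r in records]

sample = [{
    "citation": "Kesavananda Bharati v. State of Kerala (1973)",
    "facts": "Challenge to constitutional amendments limiting property rights.",
    "holding": "Parliament cannot alter the basic structure of the Constitution.",
}]
jsonl_lines = build_finetune_records(sample)
```

The resulting JSONL can then feed whichever supervised fine-tuning framework the project used.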
Medical Diagnostics LLM
Finance LLM
CLS Bank
London, UK
2021
CLS Bank, or Continuous Linked Settlement (CLS) Bank, is a specialized financial institution that operates a global, multi-currency cash settlement system for foreign exchange (FX) transactions.
Its main purpose is to eliminate settlement risk by synchronizing the exchange of currencies in a “payment versus payment” (PvP) system. This means both sides of an FX trade are settled at the same time, ensuring that neither party can send a currency without receiving the other.
Intraday Liquidity Shortfall Prediction
Developed a predictive analytics model to forecast intraday liquidity shortfalls across multiple currencies and settlement windows. Combined historical payment flows, participant funding behavior, and real-time transaction patterns using machine learning models (XGBoost, LSTM). Enabled proactive funding decisions, reduced settlement delays, and improved liquidity utilization for CLS participants.
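A hedged sketch of the shortfall model: production used XGBoost and LSTM on real payment flows, so scikit-learn's GradientBoostingRegressor stands in here, trained on synthetic features (balance, inflows, outflows) invented purely for illustration:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)

# Synthetic features standing in for historical payment flows:
# opening balance, expected inflows, expected outflows (all normalised).
X = rng.normal(size=(500, 3))
# Shortfall grows when outflows exceed balance plus inflows, plus noise.
y = np.maximum(0.0, X[:, 2] - X[:, 0] - X[:, 1]) + rng.normal(scale=0.05, size=500)

model = GradientBoostingRegressor(n_estimators=100, max_depth=3, random_state=0)
model.fit(X, y)
predicted_shortfall = model.predict(X[:5])
```

In production, the predictions would feed the funding-decision workflow ahead of each settlement window.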
Anomaly Detection in Settlement Patterns
Built an AI-driven anomaly detection system to identify unusual or risky settlement behavior in real time. Leveraged unsupervised ML techniques (Isolation Forest, Autoencoders) to analyze multi-currency transaction flows, payment timings, and participant profiles. Enhanced operational risk management, enabling early detection of irregular trades, potential system issues, and settlement risk.
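Isolation Forest, one of the techniques named above, can be sketched directly with scikit-learn; the two-feature transactions below are synthetic stand-ins for the real multi-currency flow features:

```python
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)

# Normal settlements: value and submission-time features clustered together.
normal = rng.normal(loc=0.0, scale=1.0, size=(300, 2))
# A few anomalous instructions far outside the usual pattern.
anomalies = rng.normal(loc=6.0, scale=0.5, size=(5, 2))
X = np.vstack([normal, anomalies])

detector = IsolationForest(contamination=0.02, random_state=0).fit(X)
labels = detector.predict(X)          # -1 = anomaly, 1 = normal
flagged = np.where(labels == -1)[0]
```

The `contamination` parameter encodes the expected anomaly rate; in practice it would be calibrated against historical incident data.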
Settlement Queue Optimization
Designed and implemented an optimization solution to improve the sequencing of queued FX settlement instructions. Applied constraint-based optimization and reinforcement learning to minimize liquidity usage while maximizing settlement throughput. Resulted in faster settlement cycles, reduced queue congestion, and improved overall PvP settlement efficiency for CLS.
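The real system used constraint-based optimization and RL; as a toy illustration of the liquidity-aware sequencing idea, here is a greedy smallest-first pass over the queue (instruction IDs and amounts are invented):

```python
def sequence_queue(instructions, liquidity):
    """Greedy sketch of liquidity-aware settlement sequencing.

    Each instruction is (id, amount). Settling the smallest amounts
    first maximises the number of instructions settled for a fixed
    liquidity budget -- a toy stand-in for the full constraint-based
    optimiser.
    """
    settled, remaining = [], liquidity
    for ident, amount in sorted(instructions, key=lambda p: p[1]):
        if amount <= remaining:
            settled.append(ident)
            remaining -= amount
    return settled, remaining

queue = [("A", 40), ("B", 10), ("C", 25), ("D", 50)]
settled, leftover = sequence_queue(queue, liquidity=80)
# settled == ["B", "C", "A"]; instruction D (50) exceeds the remaining 5.
```

A production sequencer would also respect netting rules, currency cut-offs, and participant limits, which this sketch omits.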
Exyte
Stuttgart, Germany
2019-2020
Exyte is a global leader in high-tech facility design and construction (semiconductors, biopharma, data centers, EV batteries, etc.).
Construction Delay Prediction
Developed an ML-based predictive system to forecast construction delays across large-scale high-tech facility projects. Integrated schedule data, resource allocation logs, contractor performance, procurement timelines, and environmental factors. Built models using Random Forest, Gradient Boosting (XGBoost), and Time-Series Regression (Prophet/LSTM) to identify high-risk tasks and early triggers of schedule slippage. Enabled proactive mitigation planning and improved on-time project delivery.
Safety Monitoring (Computer Vision + Analytics)
Implemented a real-time computer vision system to detect safety violations on construction sites, including missing PPE, unsafe proximity to machinery, and hazardous behavior patterns. Used YOLOv8/Detectron2 for object detection, Pose Estimation (OpenPose/MediaPipe) for behavior analysis, and anomaly detection models for unsafe activity prediction. Delivered automated alerts and analytics dashboards that reduced on-site safety incidents and improved compliance.
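One component of such a pipeline, checking detected person and machinery bounding boxes for unsafe proximity, can be sketched in plain Python. The box coordinates and distance threshold below are illustrative; a real system would consume YOLO/Detectron2 detections:

```python
def box_centre(box):
    """Centre point of an (x1, y1, x2, y2) bounding box in pixels."""
    x1, y1, x2, y2 = box
    return ((x1 + x2) / 2, (y1 + y2) / 2)

def unsafe_proximity(person_box, machine_box, threshold=50.0):
    """Flag a person detected too close to machinery.

    Boxes are (x1, y1, x2, y2) in pixels, as an object detector would
    emit; the pixel threshold is an illustrative calibration value
    (real systems would calibrate in metres via camera geometry).
    """
    (px, py) = box_centre(person_box)
    (mx, my) = box_centre(machine_box)
    distance = ((px - mx) ** 2 + (py - my) ** 2) ** 0.5
    return distance < threshold

alert = unsafe_proximity((100, 100, 140, 200), (130, 120, 190, 200))
```

Alerts like this would then flow into the dashboards and notification system described above.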
BIM Clash Prediction & Auto-Resolution Suggestion
Built an AI system that predicts potential design clashes in BIM (Building Information Modeling) before they occur and recommends optimal resolution strategies. Leveraged Graph Neural Networks (GNNs) and 3D-Spatial CNNs to analyze BIM components, geometry conflicts, and inter-discipline dependencies. Implemented a suggestion engine using Reinforcement Learning to propose structural adjustments, routing alternatives, and design modifications. Improved BIM coordination efficiency and reduced manual clash detection time.
Cleanroom / Thermal Optimization
Developed an ML-driven optimization framework to improve cleanroom performance, airflow efficiency, and thermal stability. Collected sensor data from HVAC systems, particle counters, pressure differentials, and thermal maps. Applied Regression Models (XGBoost/LightGBM), Neural Networks, and Bayesian Optimization to predict thermal deviations and optimize airflow patterns. Reduced energy consumption, improved environmental consistency, and ensured compliance with ISO cleanroom standards.
Technicolor
Paris, France
2018-2019
Technicolor is a global leader in film production, post-production, VFX, animation, and video/home entertainment technologies.
1. AI for Rotoscoping / Object Masking
Developed an AI-powered rotoscoping and object-masking pipeline to automate frame-by-frame segmentation for film and VFX production. Implemented deep learning models including Mask R-CNN, U-Net, and Transformer-based segmentation networks (SegFormer) to generate high-accuracy object boundaries under complex motion and lighting conditions. Reduced manual rotoscope effort by 60–70% and accelerated post-production workflows.
2. Predictive CDN Caching for High-Demand Titles
Designed a predictive analytics model to forecast content demand spikes and intelligently pre-cache high-traffic titles across global CDNs. Used Time-Series Forecasting (Prophet, LSTM), Gradient Boosting (XGBoost), and user-behavior clustering to predict viewership patterns, regional load distribution, and peak access times. Improved content availability, reduced latency, and optimized CDN resource consumption.
3. Automated Subtitle & Localization (NLP + Speech-to-Text)
Built an end-to-end AI system to automate subtitle generation, translation, and localization for international releases. Integrated ASR models (Whisper, DeepSpeech) for speech-to-text, NLP transformer models (BERT, mBART, T5) for machine translation, and language quality scoring models for accuracy validation. Enabled rapid multi-language subtitle creation and reduced manual localization time by over 50%.
4. Adaptive Streaming Optimization (QoE Enhancement)
Developed an ML-driven adaptive bitrate (ABR) optimization system to improve viewer Quality of Experience (QoE) during streaming. Leveraged Reinforcement Learning (PPO/DQN) and predictive models (bandwidth prediction using LSTM/GRU) to dynamically select optimal video bitrates based on network conditions, buffer status, and user behavior. Resulted in smoother playback, reduced rebuffering events, and improved streaming efficiency across devices.
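A toy sketch of the RL idea behind ABR selection: tabular Q-learning on a two-state bandwidth model with two bitrates. The production system used PPO/DQN on far richer state (buffer level, throughput history, device profile); rewards and transition dynamics here are invented:

```python
import random

random.seed(0)

# Toy MDP: states are bandwidth levels (0=low, 1=high); actions are
# bitrates (0=SD, 1=HD). Rewards favour HD on high bandwidth and
# penalise HD on low bandwidth (a stand-in for rebuffering cost).
REWARD = {(0, 0): 1.0, (0, 1): -2.0, (1, 0): 0.5, (1, 1): 2.0}

q = {(s, a): 0.0 for s in (0, 1) for a in (0, 1)}
alpha, gamma, epsilon = 0.1, 0.9, 0.2
state = 0
for _ in range(5000):
    if random.random() < epsilon:                      # explore
        action = random.choice((0, 1))
    else:                                              # exploit
        action = max((0, 1), key=lambda a: q[(state, a)])
    reward = REWARD[(state, action)]
    next_state = random.choice((0, 1))                 # bandwidth drifts randomly
    best_next = max(q[(next_state, a)] for a in (0, 1))
    q[(state, action)] += alpha * (reward + gamma * best_next - q[(state, action)])
    state = next_state

# Learned policy: SD on low bandwidth, HD on high bandwidth.
policy = {s: max((0, 1), key=lambda a: q[(s, a)]) for s in (0, 1)}
```

The learned policy recovers the intuitive ABR rule, which is the behaviour a production PPO/DQN agent generalises across continuous network conditions.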
WPP
London, UK
2016-2017
WPP is a British multinational communications, advertising, and public relations company headquartered in London. It is a global leader in marketing services, providing integrated communications, experience, commerce, and technology solutions through its network of agencies operating in 110 countries.
1. AI-Driven Media Buying Optimization
Developed an AI system to optimize digital media buying decisions across channels by predicting the best budget allocation, bid strategies, and ad placements. Used Reinforcement Learning (DQN/PPO), Marketing Mix Modeling (Bayesian Regression), and Predictive Modeling (XGBoost, LightGBM) to drive real-time optimization. Improved ad efficiency, increased conversions, and reduced overall acquisition cost.
2. Ad Creative Performance Prediction
Built a predictive engine that evaluates and forecasts ad creative performance using visual, textual, and engagement features. Applied Computer Vision models (ResNet, EfficientNet), NLP models (BERT, DistilBERT), and ensemble regression models to score creatives before launch. Enabled data-driven creative selection and significantly improved campaign effectiveness.
3. Cross-Channel Attribution Modelling
Designed a probabilistic attribution model to determine the impact of each channel (search, social, video, display, OTT) on conversions. Utilized Markov Chain Attribution, Shapley Value models, and Hierarchical Bayesian Modeling to assign accurate credit to touchpoints. Provided transparent attribution insights and improved multi-channel budget allocation decisions.
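The removal-effect idea behind Markov-chain attribution can be sketched on observed journeys: remove a channel, void the converting paths it touched, and normalise the resulting drop in conversion rate into attribution shares. The journey counts below are invented:

```python
def removal_effects(paths):
    """Path-level removal-effect attribution.

    `paths` maps a tuple of channel touchpoints to (conversions, totals).
    Removing a channel voids every converting path it appears on; the
    relative drop in conversion rate is that channel's removal effect.
    This is a simplified variant of Markov-chain attribution.
    """
    total = sum(n for _, n in paths.values())
    conv = sum(c for c, _ in paths.values())
    base_rate = conv / total
    channels = {ch for path in paths for ch in path}
    effects = {}
    for ch in channels:
        conv_without = sum(c for path, (c, _) in paths.items() if ch not in path)
        effects[ch] = 1 - (conv_without / total) / base_rate
    # Normalise removal effects into attribution shares summing to 1.
    scale = sum(effects.values())
    return {ch: e / scale for ch, e in effects.items()}

journeys = {
    ("search", "social"): (20, 100),
    ("search",): (10, 100),
    ("display", "social"): (5, 100),
}
shares = removal_effects(journeys)
# search earns the largest share: it touches 30 of the 35 conversions.
```

A full Markov implementation works on the transition matrix rather than raw paths, but the removal-effect logic is the same.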
4. Churn Prediction for Advertising Clients
Built a churn detection system to identify clients at risk of reducing or discontinuing ad spend. Leveraged Classification models (Random Forest, XGBoost, Logistic Regression) and Customer Behavior Clustering (K-Means) to analyze spend trends, performance metrics, and engagement patterns. Enabled proactive retention efforts and increased client lifetime value.
5. Campaign ROI Forecasting Engine
Developed a forecasting engine to predict campaign ROI based on historical performance, seasonality, creative quality, and audience attributes. Implemented Time-Series models (Prophet, LSTM), Regression models (CatBoost, Gradient Boosting), and Causal Impact modeling to estimate true campaign returns. Improved budgeting accuracy and future performance planning for marketing teams.
6. AI-Enhanced Media Strategy Planning
Created an AI-powered strategic planning tool that recommends optimal channel mix, audience targeting, and creative strategy for upcoming campaigns. Used Knowledge Graphs, Meta-Learning, Semantic NLP, and Optimization algorithms to generate actionable media plans based on historical outcomes and competitive insights. Enabled faster and more data-driven campaign planning at scale.
RBS
London, UK
2015
RBS is the Royal Bank of Scotland, a British multinational banking and financial services company that is now part of the NatWest Group. Founded in 1727, it is headquartered in Edinburgh and provides a wide range of services for individuals, businesses, and institutions globally. It operates under different brands and has undergone significant restructuring, including its separation from the NatWest Markets business in 2018 to comply with UK ring-fencing rules.
Cash Demand Forecasting for Branches & ATMs
Developed a predictive forecasting system to estimate daily/weekly cash demand for bank branches and ATMs across regions. Integrated historical withdrawal patterns, seasonality, salary cycles, holidays, and local demographic indicators. Used Time-Series models (ARIMA, SARIMA, Prophet) and deep learning models (LSTM/GRU) to accurately forecast short-term cash requirements. Reduced stock-outs and overstocking, optimizing cash logistics and lowering operational costs.
Staffing Optimization for Branches / Support Teams
Built a data-driven workforce planning model to optimize staff allocation across branches and back-office service teams. Analyzed transaction loads, service time distributions, customer footfall patterns, and peak traffic windows. Applied Demand Forecasting (Random Forest, Gradient Boosting) combined with Optimization Models (Linear Programming, Integer Programming) to recommend optimal staffing levels. Resulted in reduced wait times, balanced workloads, and improved operational efficiency.
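The LP step can be sketched with `scipy.optimize.linprog` on a two-shift toy: minimise staffing cost subject to per-shift and combined coverage constraints. Cost weights and demand figures are illustrative; the real model used integer programming across many branches:

```python
from scipy.optimize import linprog

# Decision variables: staff on the morning and evening shifts.
# Objective: minimise 8*x_morning + 10*x_evening (hourly cost weights).
c = [8, 10]

# Coverage constraints, written as A_ub @ x <= b_ub:
#   morning demand  >= 12   ->  -x_m        <= -12
#   evening demand  >=  9   ->        -x_e  <=  -9
#   combined demand >= 25   ->  -x_m - x_e  <= -25
A_ub = [[-1, 0], [0, -1], [-1, -1]]
b_ub = [-12, -9, -25]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
staff_morning, staff_evening = res.x
# The cheaper morning shift absorbs the combined-demand slack: (16, 9).
```

Rounding the LP relaxation (or switching to an integer solver) gives deployable headcounts.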
Market Risk Models (Value-at-Risk Enhancement)
Enhanced the bank’s Market Risk VaR framework by integrating advanced statistical and machine learning models to capture tail risk, volatility clustering, and non-linear market movements. Utilized GARCH models, Monte Carlo Simulation, Historical Simulation, and ML techniques like Random Forest Regression and Extreme Gradient Boosting to improve VaR accuracy. Strengthened risk monitoring, reduced model error, and improved regulatory compliance for trading books.
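Historical simulation, one of the VaR approaches named above, reduces to a loss quantile of the empirical P&L distribution. The daily returns below are synthetic:

```python
import numpy as np

def historical_var(returns, confidence=0.99):
    """One-day VaR by historical simulation: the loss quantile of the
    empirical P&L distribution (returned as a positive loss number)."""
    return -np.quantile(returns, 1 - confidence)

rng = np.random.default_rng(1)
daily_returns = rng.normal(loc=0.0, scale=0.01, size=1000)  # synthetic P&L
var_99 = historical_var(daily_returns, confidence=0.99)
```

The ML enhancements described above (GARCH, boosted models) refine the distribution being sampled; the quantile step itself stays this simple.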
FedEx
Memphis, USA
2014
FedEx is a multinational company specializing in transportation, e-commerce, and business services, known for pioneering overnight delivery. It was founded in 1971 by Frederick W. Smith, has an annual revenue of approximately $87.7 billion as of FY24, and employs over 500,000 people. The company’s extensive global network includes over 200,000 vehicles and 698 aircraft, serving more than 220 countries and territories.
Smart Route Optimization (Real-Time)
Developed a real-time dynamic route optimization engine to minimize delivery time, fuel costs, and vehicle idle time across FedEx’s ground network. Leveraged Reinforcement Learning (DQN/PPO), Graph Optimization Algorithms (A*, Dijkstra, VRP solvers), and Predictive ETA models (Gradient Boosting, LSTM). Enabled live rerouting based on traffic, weather, and last-minute pickups, significantly improving last-mile delivery efficiency.
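Dijkstra's algorithm, one of the named graph components, in a self-contained sketch over a hypothetical depot-to-stop network with drive times in minutes:

```python
import heapq

def dijkstra(graph, source, target):
    """Shortest path by Dijkstra's algorithm.

    `graph` maps node -> list of (neighbour, edge_cost)."""
    dist = {source: 0.0}
    prev = {}
    heap = [(0.0, source)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == target:
            break
        if d > dist.get(node, float("inf")):
            continue                      # stale heap entry
        for neigh, cost in graph.get(node, []):
            nd = d + cost
            if nd < dist.get(neigh, float("inf")):
                dist[neigh] = nd
                prev[neigh] = node
                heapq.heappush(heap, (nd, neigh))
    # Walk predecessors back from the target to rebuild the route.
    path, node = [], target
    while node != source:
        path.append(node)
        node = prev[node]
    return [source] + path[::-1], dist[target]

# Hypothetical network: depot plus three stops, drive times in minutes.
network = {
    "depot": [("A", 4), ("B", 2)],
    "B": [("A", 1), ("C", 7)],
    "A": [("C", 3)],
}
route, minutes = dijkstra(network, "depot", "C")
```

Live rerouting amounts to re-running such a search (or A* with a traffic-aware heuristic) whenever edge costs change.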
Shipment Volume Forecasting
Built a multi-horizon forecasting model to predict shipment volumes at national, regional, and local hub levels. Integrated historical shipment counts, e-commerce trends, holidays, macroeconomic indicators, and promotional events. Implemented Time-Series models (Prophet, SARIMA) and Deep Learning models (LSTM, Temporal Fusion Transformers). Improved capacity planning accuracy and reduced operational bottlenecks during peak seasons.
Hub Load & Resource Optimization
Designed an optimization solution to allocate manpower, sorting equipment, and dock resources inside major FedEx hubs. Used volume forecasts to drive an Integer Programming / Linear Programming optimization engine augmented with ML predictions from Random Forest and XGBoost. Improved throughput, reduced overtime costs, and eliminated surges in package backlogs during high-traffic windows.
Warehouse Robotics Intelligence
Developed an ML intelligence layer to enhance warehouse robotics tasks such as picking, sorting, object recognition, and navigation. Applied Computer Vision (YOLOv8, EfficientDet) for package detection, Reinforcement Learning for autonomous robot path planning, and Sensor Fusion + SLAM for navigation accuracy. Increased sorting speed, reduced manual intervention, and improved warehouse automation reliability.
Pricing Optimization
Implemented an AI-driven pricing engine to optimize shipping rates across service types, zones, weights, and customer segments. Used Demand Elasticity Modeling, Gradient Boosting models (XGBoost, LightGBM), and Revenue Optimization algorithms to recommend price adjustments that maximize profitability while maintaining competitive positioning. Enabled dynamic discounting and improved overall revenue per shipment.
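Demand-elasticity modelling, named above, can be sketched as a log-log regression where the slope is the constant price elasticity. The price/volume observations below are invented for illustration:

```python
import numpy as np

# Hypothetical price/volume observations for one shipping lane.
prices = np.array([8.0, 9.0, 10.0, 11.0, 12.0])
volumes = np.array([1050, 920, 830, 760, 700])

# In a log-log model  log(volume) = intercept + e * log(price),
# the slope e is the (constant) price elasticity of demand.
elasticity, intercept = np.polyfit(np.log(prices), np.log(volumes), 1)
# Here demand is roughly unit-elastic (elasticity near -1), so revenue
# is nearly flat in price and the optimiser would weigh margin instead.
```

The revenue-optimisation layer then searches over prices using this fitted demand curve, subject to competitive-positioning constraints.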
Cemex
Monterrey, Mexico
2013
Cemex is a global building materials company headquartered in Monterrey, Mexico, founded in 1906. It is one of the world’s largest cement producers and offers a wide range of products, including cement, ready-mix concrete, aggregates, and urbanization solutions. The company operates in nearly 100 countries across the Americas, Europe, Africa, the Middle East, and Asia, and focuses on providing innovative and sustainable solutions for the construction industry.
Real-Time Fleet Dispatch Optimization
Developed an AI-driven dispatch optimization engine that assigns vehicles, routes, and schedules in real time for large logistics operations. Integrated traffic, material availability, driver shift constraints, and job priority to optimize fleet movement. Implemented Reinforcement Learning (DQN, PPO), Graph Optimization (VRP, A*), and Predictive ETA models (XGBoost, LSTM). Improved fleet utilization and reduced idle time and delivery delays.
Dynamic Pricing for Logistics & Materials
Built a dynamic pricing model to optimize pricing for logistics services, bulk materials, and rentals based on demand, supply restrictions, seasonality, and competitor movements. Leveraged Gradient Boosting Models (XGBoost, LightGBM), Elasticity Estimation, and Bayesian Optimization to recommend optimal price points. Increased revenue and improved margin stability under fluctuating market conditions.
Customer Churn Prediction for Contractors & Builders
Designed an ML system to identify contractors/builders likely to reduce or stop purchases, enabling proactive retention strategies. Used transaction history, engagement patterns, project cycles, and regional market signals as inputs. Applied Classification models (Random Forest, Logistic Regression, CatBoost) and Explainable AI (SHAP) for feature insights. Helped improve customer retention and sales team targeting accuracy.
Demand-Signal Mining from Urban & Infrastructure Data
Created a predictive intelligence platform that forecasts future demand for construction materials by analyzing urban expansion, permit filings, infrastructure projects, satellite imagery, and macroeconomic indicators. Employed NLP (BERT) for document mining, Time-Series Forecasting (TFT, LSTM), and Computer Vision (ResNet, YOLO) to extract demand signals. Enabled better production planning and inventory management.
EXL
Noida, India
2012
EXL is a global data and AI company that provides analytics and digital solutions to help businesses improve operations and transform their models. Founded in 1999 and headquartered in New York, it serves industries like insurance, healthcare, and finance with services including data analytics, AI, business consulting, and digital operations. The company’s solutions aim to drive better outcomes, unlock growth, and enhance efficiency through a combination of data, AI, and industry expertise.
Workforce Scheduling Optimization & Auto Ticket Categorization
Designed an AI-driven operations engine to automate workforce scheduling and optimize resource allocation across support teams. Integrated historical workloads, SLAs, agent skill matrices, ticket arrival patterns, and seasonality to create optimal shift plans. Built an automated ticket classification system using NLP models (BERT, RoBERTa) to categorize incoming tickets and apply priority scoring. Used Time-Series Forecasting (Prophet, LSTM) and Optimization Models (Linear/Integer Programming) to improve scheduling accuracy. Resulted in higher SLA adherence, reduced backlog, and more balanced agent workloads.
Work Type Segmentation & Complexity Analysis
Developed a data-driven segmentation framework to classify operational work items into distinct buckets based on complexity, effort, and skill requirements. Analyzed text descriptions, process metadata, and activity logs to cluster similar work types. Used Unsupervised Learning (K-Means, DBSCAN, Hierarchical Clustering) and Complexity Scoring Models (Gradient Boosting, Logistic Regression) to quantify task difficulty. Enabled accurate work routing, better workforce planning, and improved operational transparency.
ML Models:
K-Means, DBSCAN, Hierarchical Clustering, Gradient Boosting (XGBoost/LightGBM)
Claims Triaging & Routing
Built an intelligent claims triaging system that predicts claim complexity, fraud likelihood, documentation completeness, and required specialist teams. Automated claim routing to appropriate adjusters based on skill level, turnaround expectations, and regulatory requirements. Leveraged Classification models (XGBoost, CatBoost, Random Forest) and NLP (BERT) for document extraction to speed up decision-making. Improved processing efficiency, reduced turnaround time, and reduced misrouted claims.
ML Models:
XGBoost, CatBoost, Random Forest, BERT-based NLP models
Recommendation System for Cross-Sell / Upsell
Developed a recommendation engine to identify high-value cross-sell and upsell opportunities for existing customers. Modeled customer transaction behavior, product adoption patterns, demographic attributes, and lifecycle events. Used Collaborative Filtering, Matrix Factorization, Deep Learning Recommenders (Neural Collaborative Filtering), and Propensity Models (Logistic Regression, Gradient Boosting). Increased customer value, improved conversion rates, and supported targeted marketing campaigns.
ML Models:
Collaborative Filtering, Matrix Factorization, Neural CF, Logistic Regression, XGBoost
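Matrix factorisation, one of the listed recommender techniques, in a minimal truncated-SVD sketch over a toy customer-product holdings matrix (the matrix and product indices are invented):

```python
import numpy as np

# Toy customer x product interaction matrix (1 = customer holds product).
R = np.array([
    [1, 1, 0, 0],
    [1, 1, 1, 0],
    [0, 0, 1, 1],
    [0, 0, 1, 1],
], dtype=float)

# Truncated SVD as a minimal matrix-factorisation recommender:
# reconstruct R from k latent factors and score unheld products.
k = 2
U, s, Vt = np.linalg.svd(R, full_matrices=False)
scores = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# For customer 0, rank products they do not yet hold by predicted score.
customer = 0
unseen = np.where(R[customer] == 0)[0]
best_next = unseen[np.argmax(scores[customer, unseen])]
# Customer 0 resembles customer 1, who holds product 2, so product 2
# scores highest among the unheld products.
```

Neural collaborative filtering replaces the linear factors with learned embeddings, but the score-unheld-items logic is the same.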
MillerCoors
Denver, USA
2011
MillerCoors was a U.S. brewing company formed in 2008 as a joint venture between Miller Brewing Company and Coors Brewing Company, but it no longer exists as a separate entity. It was fully integrated into its parent company, Molson Coors, in 2020. MillerCoors was a major producer of popular beer brands like Miller, Coors, and Blue Moon and was headquartered in Chicago.
Sales Forecasting for Each Beer Variant & Region
Developed a granular forecasting engine to predict weekly and monthly sales for each beer variant across states, regions, and distributor networks. Incorporated seasonality, weather, events, competitor activity, promotions, and distributor inventory levels. Implemented Time-Series models (SARIMA, Prophet) and deep learning-based forecasting (LSTM, Temporal Fusion Transformers – TFT) to improve planning accuracy. Enabled better production sequencing, reduced stock-outs, and improved distribution planning.
ML Models: SARIMA, Prophet, LSTM, TFT, Gradient Boosting (XGBoost)
Packaging Line Optimization
Built an AI system to optimize packaging line throughput by analyzing equipment telemetry, downtime logs, operator inputs, SKU changeovers, and bottleneck patterns. Applied Predictive Maintenance models, Process Mining, and Supervised Learning (Random Forest, XGBoost) to identify inefficiencies and recommend optimal line settings. Reduced unplanned downtimes, increased OEE (Overall Equipment Effectiveness), and improved packaging throughput.
ML Models: Random Forest, XGBoost, Anomaly Detection, Predictive Maintenance (RUL Models)
Raw Material Demand Forecasting (Hops, Malt, Yeast, Sugar)
Created a material forecasting engine to predict short-term and long-term demand for brewing raw materials. Integrated production schedules, sales forecasts, wastage patterns, supplier lead times, and seasonality. Used Hierarchical Time-Series Forecasting, LSTM, and Ensemble Regression Models to produce SKU-level material demand plans. Reduced overstocking, minimized wastage, and improved procurement efficiency.
ML Models: Hierarchical Forecasting, LSTM, Random Forest Regression, ARIMA/SARIMA
Yield Optimization (Mash Tun → Fermentation → Filtration → Packaging)
Designed an end-to-end yield optimization model across the entire brewing process. Analyzed temperature curves, mash profiles, enzyme activity, fermentation time, filtration losses, and packaging inefficiencies. Built Multivariate Regression, ANN-based Process Models, and Optimization Algorithms to maximize output yield while maintaining quality standards. Enabled consistent production runs and significant cost savings through waste reduction.
ML Models: Multivariate Regression, Artificial Neural Networks (ANN), Gradient Boosting, Bayesian Optimization
Fermentation Process Optimization (High Impact)
Developed an AI-driven control framework to optimize fermentation dynamics—one of the most critical and cost-sensitive brewing stages. Modeled CO₂ evolution, alcohol formation, yeast activity curves, and temperature/stirring patterns. Applied Reinforcement Learning (PPO, Model-Based RL) and Time-Series Predictive Models (LSTM, GRU) to recommend optimal temperature and aeration strategies. Improved batch consistency, reduced fermentation time, and enhanced product quality.
ML Models: Reinforcement Learning (PPO, DDPG), LSTM/GRU, Process Optimization Models, Anomaly Detection
AXA
Paris, France
2010
AXA is a French multinational insurance and asset management company, headquartered in Paris, that provides a wide range of insurance and financial products to individuals and businesses. It operates in the property & casualty, life & savings, and health sectors, and also manages assets for its clients. AXA is one of the world’s largest insurance brands, with a significant global presence across multiple continents.
Customer Experience / Retention Analytics
Built an analytics framework to measure customer experience across channels (branch, app, call center) using behavioral signals, service interactions, complaints, and NPS trends. Developed models to identify key drivers impacting customer satisfaction and retention. Delivered insights to CX and product teams to redesign journeys, reduce friction points, and improve service personalization.
ML Models: Regression Models, Gradient Boosting (XGBoost/LightGBM), NLP (BERT) for feedback analytics, Clustering Models for CX pattern mining.
Customer Churn Prediction
Developed a predictive model to identify retail and SME customers most likely to churn based on transaction trends, product usage, digital engagement, life events, and sentiment extracted from support interactions. Enabled proactive retention campaigns and improved relationship management.
ML Models: Logistic Regression, Random Forest, XGBoost, CatBoost, SHAP for explainability.
Recommendation System for Cross-Sell / Upsell
Built a personalized recommendation engine to suggest banking products (credit cards, savings accounts, loans, insurance) based on customer transaction behavior, lifecycle stage, financial goals, and propensity models. Improved product adoption rates while reducing unnecessary marketing outreach.
ML Models: Collaborative Filtering, Matrix Factorization, Neural Collaborative Filtering (NCF), Gradient Boosting Propensity Models.
Customer Segmentation for Personalized Marketing
Created a segmentation framework to classify customers into behavioral and financial clusters for targeted campaigns. Used demographic data, digital engagement metrics, financial health scores, and spending patterns to design high-impact persona groups. Supported hyper-personalized marketing and reduced acquisition costs.
ML Models: K-Means, Hierarchical Clustering, DBSCAN, PCA for dimensionality reduction.
Agent Fraud Detection
Implemented a fraud detection system to identify suspicious activities among internal agents, including anomalous account actions, unauthorized transaction patterns, and unusual login/access behaviors. Combined rule-based engines with anomaly detection and supervised ML to reduce internal fraud risk.
ML Models: Isolation Forest, Autoencoders, Random Forest Classifier, XGBoost, Behavioral Anomaly Models.
NAB
Melbourne, Australia
2009
National Australia Bank (NAB) is a major Australian financial services company offering a wide range of products including personal and business banking, wealth management, and financial advice. It was formed in 1982 and operates across Australia, New Zealand, the UK, and Asia, serving individual consumers, small and large businesses, and institutions.
Automated Loan Underwriting
Developed an AI-driven underwriting engine to automate creditworthiness assessment for personal, business, and home loans. Integrated customer financial data, income stability, debt ratios, bureau reports, and alternative data sources to accelerate decisioning. Delivered risk-adjusted approval recommendations with explainability for compliance and audit requirements.
ML Models: Gradient Boosting (XGBoost/LightGBM), Logistic Regression, Explainable AI (SHAP/LIME), Ensemble Models, Rule-Based Hybrid Systems.
Collateral Valuation & Risk Assessment
Built a predictive system to evaluate collateral value for secured lending (homes, commercial property, equipment). Used market trends, property attributes, comparable sales, depreciation curves, and macroeconomic indicators to generate real-time valuations and risk scores. Improved accuracy of LTV (Loan-to-Value) calculations and reduced manual appraisal effort.
ML Models: Regression Models (ElasticNet, Random Forest Regression), XGBoost, Time-Series Market Forecasting, Computer Vision (ResNet) for property image analysis.
Credit Risk Scoring & Loan Default Prediction
Designed a machine learning framework to assess default probability across consumer and business loan portfolios. Modeled historical repayment behavior, credit utilization, spending patterns, liquidity signals, and macroeconomic factors to generate risk scores. Supported portfolio stress testing and pricing decisions.
ML Models: Logistic Regression, Random Forest, CatBoost, XGBoost, Survival Analysis Models, GBDT Ensembles with SHAP Explainability.
Real-Time Fraud Detection
Implemented a real-time fraud detection engine for card transactions, online banking, and payments. Built models to detect abnormal spending patterns, geographic anomalies, device fingerprint mismatches, and suspicious velocity behaviors. Integrated streaming analytics for sub-second decisioning.
ML Models: Isolation Forest, Autoencoders, Gradient Boosting, Graph-Based Anomaly Detection, LSTM/GRU for sequence behavior modeling.
Fake Document Detection
Developed an AI system to detect forged financial documents (bank statements, payslips, IDs) submitted during loan applications. Used OCR and vision-based models to extract features, verify authenticity, detect tampering, and identify synthetic content. Reduced manual review load and strengthened fraud defenses.
ML Models: OCR (Tesseract/docTR), CNN Models (ResNet, EfficientNet), Vision Transformers (ViT), Anomaly Detection, NLP Models (BERT) for text consistency checks.
Danske Bank
Copenhagen, Denmark
2008
Danske Bank is a Danish multinational banking and financial services company headquartered in Copenhagen, and it is the largest bank in Denmark and a major retail bank in Northern Europe. It was founded in 1871 and provides a wide range of services including personal and business banking, asset management, and corporate services. The bank has significant operations in its home country, as well as in Sweden, Finland, and Northern Ireland, where it operates as a standalone business unit.
Customer Churn Prediction
Developed a predictive churn model to identify retail and SME customers at high risk of leaving based on product usage patterns, transaction behavior, digital engagement, service interactions, and macro-economic signals. Integrated the model into CRM workflows for proactive retention interventions and personalized offers.
ML Models Used:
XGBoost, Random Forest, Logistic Regression, CatBoost, SHAP for explainability.
Quantifiable Outcomes:
Improved churn prediction accuracy by 21%.
Enabled targeted retention campaigns reducing churn by 13% YoY.
Increased customer lifetime value by 8% in high-risk segments.
Personalized Product Recommendation System
Built an AI-driven recommendation engine to provide personalized banking product suggestions (loans, savings, wealth products, insurance). Analyzed spending patterns, life-stage indicators, financial health scores, and digital behavior to generate tailored recommendations.
ML Models Used:
Collaborative Filtering, Matrix Factorization, Neural Collaborative Filtering (NCF), Gradient Boosting Propensity Models.
Quantifiable Outcomes:
Increased cross-sell/upsell conversion by 17%.
Reduced irrelevant marketing communication by 28%.
Boosted average product per customer (PPC) by 0.6 products.
Loan Portfolio Optimization Model
Designed an optimization engine to balance risk, return, and capital allocation across the full lending portfolio (retail, SME, corporate). Modeled default risk, interest rate sensitivity, exposure concentrations, and regulatory capital requirements to derive optimal loan mix and pricing strategies.
ML Models Used:
Regression Models (ElasticNet, Random Forest Regression), XGBoost, Portfolio Optimization (Linear & Quadratic Programming), Survival Models.
Quantifiable Outcomes:
Improved risk-adjusted return on portfolio (RAROC) by 11%.
Reduced high-risk loan exposure by 9%.
Achieved 6% improvement in capital efficiency (RWA optimization).
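The quadratic-programming core of such portfolio optimisation has a closed form in the simplest (minimum-variance) case: w = Σ⁻¹1 / (1ᵀΣ⁻¹1). The covariance matrix below is an invented toy for three loan segments; the real engine layered on return targets and regulatory-capital constraints:

```python
import numpy as np

# Toy covariance of returns for three loan segments
# (retail, SME, corporate) -- illustrative numbers only.
cov = np.array([
    [0.04, 0.01, 0.00],
    [0.01, 0.09, 0.02],
    [0.00, 0.02, 0.16],
])

# Minimum-variance weights solve  min w' cov w  s.t. sum(w) = 1,
# giving the closed form w = inv(cov) @ 1 / (1' inv(cov) 1).
ones = np.ones(3)
inv = np.linalg.inv(cov)
weights = inv @ ones / (ones @ inv @ ones)
portfolio_var = weights @ cov @ weights
# The low-variance retail segment receives the largest weight, and the
# portfolio variance falls below every individual segment's variance.
```

Adding expected-return and exposure constraints turns this into the full quadratic program the project description refers to.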
Early Warning System for NPAs (Non-Performing Assets)
Developed a predictive Early Warning System (EWS) to detect customers and businesses likely to become NPAs in advance by analyzing repayment behavior, cashflow trends, overdraft patterns, credit bureau signals, and macroeconomic indicators. Integrated alerts into risk operations for proactive recovery measures.
ML Models Used:
XGBoost, CatBoost, Logistic Regression, Time-Series Risk Indicators, Anomaly Detection (Isolation Forest).
Quantifiable Outcomes:
Identified potential NPAs 3–6 months earlier.
Reduced NPA formation rate by 14% in pilot portfolios.
Improved recovery efficiency by 18% through early interventions.
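A minimal sketch of the scoring idea behind such an EWS, with hypothetical repayment features and labels; plain gradient-descent logistic regression stands in here for the production XGBoost/CatBoost ensemble.

```python
import numpy as np

# Toy borrower features: [missed_payments_3m, overdraft_days_3m, dti_ratio]
# (all values invented for illustration)
X = np.array([
    [0, 0, 0.2], [0, 1, 0.3], [1, 2, 0.4], [0, 0, 0.1],
    [2, 10, 0.7], [3, 15, 0.8], [2, 12, 0.6], [4, 20, 0.9],
])
y = np.array([0, 0, 0, 0, 1, 1, 1, 1])   # 1 = slipped into NPA within 6 months

Xb = np.hstack([np.ones((len(X), 1)), X])   # add intercept column
w = np.zeros(Xb.shape[1])
for _ in range(20000):                      # batch gradient descent on log-loss
    p = 1 / (1 + np.exp(-Xb @ w))
    w -= 0.01 * Xb.T @ (p - y) / len(y)

def npa_risk(missed, overdraft, dti):
    """Probability-style early-warning score for one borrower."""
    z = w @ np.array([1.0, missed, overdraft, dti])
    return float(1 / (1 + np.exp(-z)))

high = npa_risk(3, 14, 0.75)   # stressed borrower
low  = npa_risk(0, 0, 0.15)    # healthy borrower
```

In practice the score would be thresholded into alert tiers and routed to risk operations, as described above.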
ILO
Geneva, Switzerland
2007
The International Labour Organization (ILO) is a specialized agency of the United Nations that promotes social justice and international labour standards. It is a unique "tripartite" agency, bringing together representatives of governments, employers, and workers from its 187 member states and giving each an equal voice in shaping policies and programs. Its mission is to set labour standards, promote decent employment, enhance social protection, and strengthen social dialogue.
Global Labour Market Intelligence Platform
Designed an AI-powered analytics platform aggregating global labor statistics, job trends, demographic indicators, and economic signals from 150+ countries. Delivered real-time insights on employment, skills shortages, informal sector trends, and vulnerable worker groups to support UN labour policy decisions.
ML Models Used:
Time-Series Forecasting (Prophet, LSTM), Clustering (K-Means, DBSCAN), NLP (BERT) for policy text mining, Gradient Boosting for predictive modelling.
Quantifiable Outcomes:
Reduced research & report preparation time by 40%.
Improved forecast accuracy for labor indicators by 18%.
Enabled 90+ country teams to access unified labour intelligence dashboards.
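The country-segmentation component can be sketched with K-Means on a few labour indicators. The figures below are invented for illustration, not ILO statistics.

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical per-country indicators:
# [unemployment %, informal-sector share %, youth NEET %]
indicators = np.array([
    [4.2, 12, 8],   [5.1, 15, 9],   [3.8, 10, 7],    # advanced economies
    [11.5, 55, 24], [13.2, 60, 27], [12.0, 52, 22],  # high-informality economies
])

# Two clusters: low- vs high-informality labour markets
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(indicators)
```

In the real platform, each cluster profile would drive which dashboards and policy comparisons a country team sees.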
Child Labour Risk Detection System
Developed a predictive framework to identify regions and industries at high risk of child labour by analyzing poverty levels, school attendance, household vulnerability, supply-chain data, and conflict indicators. Integrated satellite imagery for remote verification.
ML Models Used:
XGBoost, Random Forest, Logistic Regression, Computer Vision (ResNet), Geospatial Models.
Quantifiable Outcomes:
Improved risk hotspot identification by 27%.
Helped prioritize interventions leading to 15% faster deployment of field resources.
Supported 20+ global programs with evidence-based targeting.
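The tabular risk-classification step can be sketched as follows, with synthetic district features and a toy labelling rule standing in for real survey and supply-chain data.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(42)
n = 400
# Synthetic district features: [poverty_rate, school_attendance, conflict_index]
X = np.column_stack([
    rng.uniform(0.0, 1.0, n),   # poverty rate
    rng.uniform(0.3, 1.0, n),   # school attendance
    rng.uniform(0.0, 1.0, n),   # conflict index
])
# Toy label rule: high poverty combined with low attendance => high risk
y = ((X[:, 0] > 0.6) & (X[:, 1] < 0.6)).astype(int)

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
hot  = clf.predict_proba([[0.9, 0.40, 0.5]])[0, 1]   # deprived district
cold = clf.predict_proba([[0.1, 0.95, 0.1]])[0, 1]   # well-served district
```

The satellite-imagery branch (ResNet) would contribute additional features to `X` rather than replace this tabular model.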
Modern Slavery & Forced Labour Detection
Built an ML-based detection system to identify forced labour patterns in supply chains by mining inspection reports, worker grievances, recruitment data, wage records, and online job postings. Used NLP to detect coercion patterns and exploitative contract language.
ML Models Used:
NLP (BERT, RoBERTa), Anomaly Detection (Autoencoders, Isolation Forest), Gradient Boosting (LightGBM).
Quantifiable Outcomes:
Detected 22% more high-risk entities than manual methods.
Reduced manual review workload by 35%.
Increased early intervention accuracy by 19%.
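The anomaly-detection branch can be illustrated with an Isolation Forest over two invented employer features: pay relative to the legal minimum and recruitment fees charged to workers (a common coercion signal).

```python
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(7)
# Synthetic "normal" employers: pay near 1.2x minimum wage, negligible fees
normal = np.column_stack([
    rng.normal(1.2, 0.10, 200),   # wage / legal minimum
    rng.normal(0.0, 0.05, 200),   # recruitment fee / monthly wage
])
# One suspect employer: far below minimum wage, very large recruitment fee
suspect = np.array([[0.4, 1.5]])

iso = IsolationForest(random_state=0).fit(normal)
flag = iso.predict(suspect)[0]    # -1 = anomaly, 1 = normal
```

Flagged entities would then be queued for NLP review of their contracts and grievance records, as described above.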
Fair-Wage Compliance Analytics
Built a wage-compliance analytics engine to detect wage theft, unfair pay practices, and non-compliance with minimum wage laws across countries and sectors. Modelled wage distributions, gender pay gaps, overtime anomalies, and sector norms.
ML Models Used:
Regression Models, Anomaly Detection, Statistical Wage Curve Modelling, SHAP Explainability.
Quantifiable Outcomes:
Identified 30% more wage violations versus traditional audits.
Enabled corrective employer actions reducing wage non-compliance by 12%.
Cut compliance analytics time by 50%.
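The simplest compliance checks in such an engine are rule-plus-statistics passes over payroll records. The sketch below uses an invented payroll sample and an assumed minimum wage and overtime threshold.

```python
import numpy as np

# Hypothetical payroll sample: hourly wage, weekly hours, gender per worker
wages  = np.array([9.5, 12.0, 8.0, 15.0, 7.2, 11.0, 10.5, 6.9])
hours  = np.array([40,  45,   60,  38,   55,  40,   42,   58 ])
gender = np.array(["F", "M",  "F", "M",  "F", "M",  "F",  "F"])

MIN_WAGE = 8.5        # assumed statutory hourly minimum
OT_THRESHOLD = 48     # weekly hours above which overtime pay is due

# Direct wage-theft flags: paid below the statutory minimum
below_minimum = np.nonzero(wages < MIN_WAGE)[0]

# Overtime-abuse flags: long hours at near- or sub-minimum pay
overtime_risk = np.nonzero((hours > OT_THRESHOLD) & (wages < MIN_WAGE * 1.1))[0]

# Raw gender pay gap in this sample (mean male minus mean female hourly wage)
pay_gap = wages[gender == "M"].mean() - wages[gender == "F"].mean()
```

The statistical wage-curve and SHAP layers mentioned above refine these raw flags by conditioning on sector, occupation, and experience.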
Global Remote Work & Digital Platform Economy Analysis
Developed a global analytical framework to measure the rise of remote work, gig economy participation, and online platform labour. Mined job postings, worker profiles, digital platform data, and cross-border wage patterns.
ML Models Used:
NLP (BERT) for job description mining, Topic Modelling (LDA), Clustering (K-Means), Time-Series Models (ARIMA, LSTM).
Quantifiable Outcomes:
Improved labour trend detection accuracy by 25%.
Provided evidence supporting 5 major ILO policy papers on digital labour rights.
Delivered insights used by 35+ countries for national labour strategy updates.
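The job-posting mining step can be sketched with TF-IDF vectorization plus clustering; the four postings below are invented, and K-Means stands in for the fuller LDA/BERT pipeline.

```python
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer

# Hypothetical job postings: two remote/gig roles, two on-site roles
posts = [
    "remote freelance developer flexible hours gig platform",
    "remote gig designer freelance platform flexible",
    "factory floor operator on-site shift machinery",
    "on-site warehouse operator shift machinery factory",
]

X = TfidfVectorizer().fit_transform(posts)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
```

Cluster sizes over time then become the trend series that feeds the ARIMA/LSTM models listed above.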
Migration Labour Flow Modelling
Built predictive models to forecast cross-border labour migration based on geopolitical events, demographic pressures, economic shifts, skill shortages, and climate-driven displacement. Created early-warning insights for governments.
ML Models Used:
Time-Series Forecasting, Bayesian Networks, LSTM, Gradient Boosting, Geospatial Flow Models.
Quantifiable Outcomes:
Enhanced migration flow prediction accuracy by 23%.
Helped governments plan labour programs 3–6 months earlier.
Improved policy targeting for migrant protection by 17%.
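A minimal forecasting baseline for such flows is a fitted trend extrapolated one period ahead; the quarterly figures below are invented, and the linear fit stands in for the LSTM/Bayesian stack listed above.

```python
import numpy as np

# Hypothetical quarterly outward-migration counts (thousands), trending upward
flows = np.array([10.2, 10.8, 11.5, 11.9, 12.6, 13.1, 13.9, 14.4])
t = np.arange(len(flows))

# Fit a linear trend and extrapolate one quarter ahead as an early-warning baseline
slope, intercept = np.polyfit(t, flows, 1)
forecast_next = slope * len(flows) + intercept
```

Richer models earn their keep by conditioning this baseline on the geopolitical, demographic, and climate covariates named above.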
Trade–Labour Policy Impact Simulation Tool
Developed a simulation engine to estimate the labour-market impact of trade agreements, tariffs, automation, and regulatory changes. Modelled job creation/loss, wage shifts, and sectoral transitions under various policy scenarios.
ML Models Used:
Agent-Based Models, System Dynamics, Econometric Models, Reinforcement Learning for scenario optimization.
Quantifiable Outcomes:
Reduced economic modelling time by 45%.
Increased accuracy of employment impact projections by 20%.
Used by 10+ national governments to shape labour & trade policies.
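A stripped-down agent-based sketch of one scenario: a tariff shock cuts export-sector demand, and domestic-sector firms absorb a share of the displaced workers. All firm counts and rates are invented for illustration.

```python
# Minimal agent-based sketch: firms are agents holding jobs; a tariff shock
# removes export-sector jobs and the domestic sector partially absorbs them.
firms = [{"sector": "export", "jobs": 100} for _ in range(5)] + \
        [{"sector": "domestic", "jobs": 100} for _ in range(5)]

def apply_tariff_shock(firms, demand_drop=0.2, absorb_rate=0.5):
    """One simulation step: shed export jobs, re-absorb some domestically."""
    displaced = 0
    for f in firms:
        if f["sector"] == "export":
            lost = int(f["jobs"] * demand_drop)
            f["jobs"] -= lost
            displaced += lost
    absorbed = int(displaced * absorb_rate)
    share = absorbed // sum(1 for f in firms if f["sector"] == "domestic")
    for f in firms:
        if f["sector"] == "domestic":
            f["jobs"] += share
    return displaced, absorbed

displaced, absorbed = apply_tariff_shock(firms)
net_job_loss = displaced - absorbed
```

A full simulator iterates such steps over many heterogeneous agents and policy scenarios, which is where the RL-based scenario optimization applies.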
Accenture
Bangalore, India
2005-2006
Accenture is a global professional services company specializing in IT, consulting, digital, technology, and operations. It helps large enterprises transform through strategy, cloud, AI, and security, drawing on nearly 800,000 people and serving clients across 40+ industries worldwide, with a focus on innovation and sustainability. Its offerings span Strategy & Consulting, Technology, Operations, and Accenture Song (creative/marketing), positioning it as a reinvention partner for building digital cores and creating AI-driven value.
Network Anomaly Detection & Alert Prioritization (High Impact)
Problem
NOC teams receive thousands of alerts (SNMP traps, syslogs, threshold breaches). Most are false alarms or low priority.
ML Solution
Build an ML model that:
- Detects unusual patterns in traffic/device metrics (unsupervised anomaly detection)
- Assigns severity levels to incoming alerts
- Suppresses noise automatically
Techniques
Isolation Forest, Autoencoders, One-Class SVM, LSTM.
Impact
✔ 40–60% alert noise reduction
✔ Faster detection of real issues
✔ Reduced operator fatigue
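The simplest baseline for the unsupervised detectors listed above is a z-score rule over a rolling baseline window: only extreme deviations page the NOC, suppressing routine noise. The traffic trace below is simulated.

```python
import numpy as np

rng = np.random.default_rng(1)
# Simulated per-minute link utilisation (%) with one anomalous burst injected
traffic = rng.normal(40, 3, 120)
traffic[100] = 95.0

# Baseline statistics from an earlier clean window
mean, std = traffic[:90].mean(), traffic[:90].std()

z = np.abs(traffic - mean) / std
alerts = np.nonzero(z > 5)[0]   # only extreme deviations raise an alert
```

Isolation Forests and autoencoders generalize this idea to many correlated metrics at once, where a single-metric threshold would miss multivariate anomalies.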
Network Traffic Forecasting for Capacity Planning
Problem
Capacity upgrades are done reactively.
ML Solution
Forecast:
- Bandwidth utilization
- Link saturation
- Latency trends
- Peak usage periods
Techniques
ARIMA, LSTM, Temporal Fusion Transformer.
Impact
✔ Better capacity planning
✔ Avoid congestion
✔ Save cost by forecasting upgrade needs
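Before reaching for LSTM or Temporal Fusion Transformers, a seasonal-naive baseline is the standard yardstick: project the last day's profile forward, adjusted for observed growth. The traffic series below is synthetic.

```python
import numpy as np

# Synthetic hourly bandwidth (Gbps): 24-hour cycle plus mild linear growth
hours = np.arange(24 * 14)                     # two weeks of history
traffic = 10 + 3 * np.sin(2 * np.pi * hours / 24) + 0.005 * hours

# Seasonal-naive forecast: tomorrow = last day's profile + mean daily growth
last_day = traffic[-24:]
daily_growth = traffic[-24:].mean() - traffic[-48:-24].mean()
forecast = last_day + daily_growth

peak_hour = int(np.argmax(forecast))           # hour when saturation is likeliest
```

Capacity planners compare the forecast peak against link capacity to decide when an upgrade is needed; the learned models replace this baseline once they beat it.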
GECIS
Hyderabad, India
2002-2004
GECIS (GE Capital International Services) was a pioneer in the business process outsourcing (BPO) and IT services industry in India, originally established to lower GE's internal costs. In 2004-2005, GE partnered with investors General Atlantic and Oak Hill Partners to spin GECIS off as an independent entity, subsequently rebranded as Genpact ("Generating Impact"). Genpact is now a major global IT services, consulting, and outsourcing company.
Workforce Scheduling Optimization
Optimized rosters to reduce idle time and overtime costs
Predicted call/chat/email volume and staffing requirements by shift
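The roster step can be framed as a shift-coverage linear program: choose how many agents start on each shift so that every time block meets its forecast demand with minimum total headcount. The demand figures and shift shape below are invented.

```python
import numpy as np
from scipy.optimize import linprog

# Forecast agents required per 4-hour block over one day (illustrative)
demand = np.array([4, 8, 14, 12, 10, 5])
n = len(demand)

# Shift s covers block s and the following block (wrapping around midnight);
# x[s] = number of agents starting at block s.
A = np.zeros((n, n))
for s in range(n):
    A[s, s] = 1
    A[(s + 1) % n, s] = 1

# Minimise total headcount subject to covering demand in every block
res = linprog(c=np.ones(n), A_ub=-A, b_ub=-demand, bounds=[(0, None)] * n)
staff = res.x   # agents starting per shift (may be fractional in the LP)
```

A production roster would add integrality, breaks, and skill constraints, typically via mixed-integer programming.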
Auto Ticket Categorization & Prioritization
Assigned priorities and routed tickets to the correct team
Extracted key information from service requests via NLP
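The routing step can be sketched as a TF-IDF + Naive Bayes text classifier; the six training tickets and three team labels below are invented for illustration.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Tiny labelled ticket set (illustrative only)
tickets = [
    "cannot login password reset required",
    "password expired account locked",
    "invoice amount wrong billing dispute",
    "billing overcharge refund request",
    "laptop screen broken hardware replacement",
    "hardware keyboard not working replace",
]
teams = ["access", "access", "billing", "billing", "hardware", "hardware"]

# Vectorize ticket text and fit a simple routing classifier
router = make_pipeline(TfidfVectorizer(), MultinomialNB()).fit(tickets, teams)
routed = router.predict(["customer disputes invoice overcharge"])[0]
```

Priority can be assigned the same way, training a second classifier on severity labels instead of team labels.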