AVAILABLE FOR WORK

Mrunal Meshram

Building scalable systems, intelligent pipelines, and data-driven solutions across fintech & enterprise environments.


01 · About Me


I'm a Software & Data Engineer with over 5 years of experience building scalable backend systems, data pipelines, and intelligent solutions across fintech, payments, and enterprise environments.

My work spans the full data lifecycle — from designing ETL/ELT pipelines and optimizing SQL-heavy workloads, to developing REST APIs, integrating AI/ML components, and building interactive BI dashboards that drive business decisions.

I've worked with technologies like Python, Java, SQL, Azure, AWS, Databricks, and Power BI, as well as modern AI tooling such as LangChain and LLMs. I'm passionate about turning complex data into clear, actionable insights and building systems that are reliable, performant, and maintainable.

Currently based in Dublin, Ireland, I hold a Master's in Artificial Intelligence from the National College of Ireland, and I'm always looking for challenging problems at the intersection of software engineering, data, and AI.

5+ Years Experience
6+ Companies
8+ Projects Delivered
MSc Artificial Intelligence

02 · Skills & Technologies

Languages

Python · Java · SQL · T-SQL · Cypher · DAX · HTML/CSS · JavaScript

Data & Databases

SQL Server · PostgreSQL · Oracle · Neo4j · Snowflake · Redshift · SSIS · SSAS

Cloud & Infrastructure

Azure Data Factory · Azure Synapse · AWS S3 · AWS Lambda · AWS Glue · Databricks · Docker · GCP

Frameworks & Tools

Apache Spark · FastAPI · Django · LangChain · REST APIs · Git · JIRA · CI/CD

BI & Visualization

Power BI · Tableau · Looker · SSRS · TruBI · Excel

AI & Machine Learning

LLMs · Deep Learning · NLP · RAG Pipelines · Prompt Engineering · FAISS · OpenAI Whisper · StyleGAN2

Practices & Concepts

ETL/ELT · Data Warehousing · Data Modeling · Data Governance · Agile/Scrum · AML/KYC · RLS

03 · Professional Experience

Software Engineer

JDIT Software Solutions

Aug 2025 – Present · Remote, Ireland
  • Developed backend REST APIs using Python to support health insurance workflows including claims processing and policy data retrieval
  • Built automation features that reduced manual effort in insurance data handling and validation processes
  • Implemented LLM-based components for document processing and query automation within internal tools
  • Integrated third-party and internal APIs for seamless data exchange across insurance systems
  • Worked with LangChain to prototype AI-driven workflows for document understanding and task automation
  • Optimized SQL queries and backend logic to improve response time and system efficiency
  • Followed secure coding practices for handling sensitive insurance and customer data in an agile environment
Python · REST APIs · LangChain · LLMs · SQL · Agile

AI Researcher & Developer

National College of Ireland (MSc in AI)

Jan 2024 – Feb 2025 · Dublin, Ireland
  • Expanded expertise in AI with a strong focus on real-world applications, scalable solutions, and emerging trends in Generative AI and autonomous agents
  • Designed and developed research-driven projects using Large Language Models (LLMs) and modern AI frameworks
  • Built AI-powered agent workflows and automation systems leveraging LLMs for task orchestration and decision-making
  • Applied machine learning and deep learning techniques to build predictive and intelligent systems
  • Explored and implemented agent-based architectures, including multi-step reasoning and tool integration
  • Presented technical findings and project outcomes to academic and professional audiences
  • Strengthened skills in data handling, model evaluation, prompt engineering, and performance optimization
LLMs · LangChain · Deep Learning · Python · NLP · RAG

Data Analyst (Internship)

The Urban Co-op Wellness Hub

Jan 2024 – May 2024 · Dublin, Ireland
  • Built Power BI dashboards analyzing 3+ years of operational data, improving demand planning accuracy by ~30%
  • Automated SQL-based data refresh pipelines, reducing manual reporting effort by 25%
  • Presented insights to non-technical stakeholders to inform operational decisions
Power BI · SQL · Data Analysis · Stakeholder Communication

Software & Data Engineer

FIS (Contract)

Sep 2023 – Jan 2024 · Pune, India
  • Designed and developed backend data processing workflows using Python and SQL, improving payment platform throughput by 22%
  • Translated business and merchant requirements into scalable, production-ready data pipelines and services
  • Built and optimized ETL systems processing millions of financial records, reducing manual processing by 60+ hours/month
  • Designed scalable data models supporting BI reporting for risk and compliance teams, increasing reporting accuracy by 22%
  • Modernized data ingestion from APIs and cloud storage into enterprise warehouses, enhancing data freshness and reliability
  • Collaborated with DevSecOps and infrastructure teams to resolve database performance and security issues in high-availability systems
  • Implemented CI/CD and version control best practices, streamlining deployment and maintaining code quality
  • Built interactive Power BI dashboards using DAX for real-time payment anomaly monitoring, reducing false positives by 15%
Python · SQL · ETL · Power BI · DAX · Azure · CI/CD · Git

Software & Data Engineer

Fiserv

May 2022 – Aug 2023 · Pune, India
  • Designed, developed, and maintained scalable software systems for high-volume payment and financial data processing
  • Built end-to-end data pipelines using Python, SQL, and cloud technologies, improving system efficiency and reducing processing time by up to 40%
  • Engineered data models and storage solutions supporting analytics, reporting, and real-time decision-making
  • Integrated ML models for anomaly detection in payment processing, cutting false-positive fraud flags by 15% and saving $1.2M annually
  • Automated data extraction workflows using Python, saving 20+ manual hours per week
  • Created Power BI dashboards with DAX-driven business logic for merchant settlement reporting, accelerating reconciliation by 14%
  • Designed SQL-based transaction monitoring rules integrating AML and KYC controls for regulatory compliance
  • Optimized SQL queries and indexing strategies, improving transaction monitoring and reporting performance by 30%+
  • Collaborated with cross-functional teams (DevSecOps, BI, Product) to design system architecture and align technical solutions with business goals
Python · Java · SQL Server · AWS · Databricks · Power BI · SSIS · Tableau · Docker

Data & BI Engineer

Datamatics

Oct 2019 – May 2022 · Mumbai, India
  • Built and maintained ETL pipelines and SQL transformations supporting enterprise reporting and analytics systems
  • Managed migration of on-prem legacy data to cloud architecture by building robust Python ETL pipelines
  • Designed automated Power BI dashboards simplifying complex technical data for regional management teams
  • Developed Python scripts for data extraction, validation, and API-based ingestion, enhancing pipeline consistency
  • Optimized SQL workloads through query refactoring, indexing, and partitioning, improving performance by 25–35%
  • Ran quantitative analysis on large-scale datasets to build forecasting models for 3-year strategic business plans
  • Delivered 24/7 production support for ETL jobs, reducing recurring failures through preventive controls and monitoring
  • Reduced query latency by 40% through advanced indexing and SQL optimization for enterprise applications
SQL Server · Python · SSIS · Power BI · Azure · ETL · T-SQL

04 · Education

2024 – 2025

Master's in Artificial Intelligence

National College of Ireland

Dublin, Ireland
2019

PG Diploma in Advanced Computing

Centre for Development of Advanced Computing (CDAC)

Grade-A Institute AIT YCP
2014 – 2017

Bachelor's in Electronics Engineering

MIT Academy of Engineering

Pune, India

05 · Projects

AI / ML

Neo4j-Powered Citation Network Link Prediction

Built a GDS link-prediction pipeline leveraging graph algorithms (common neighbors, preferential attachment, triangle counts), achieving an 82.2% speed-up. Trained a Random Forest model natively in GDS (AUCPR 0.826 training, 0.801 testing). Hybrid Neo4j-Python workflow with an 87.3% cache hit rate and a 3.2x throughput improvement.

Neo4j · Cypher · Python · Py2neo · Random Forest · GDS
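A minimal sketch of the idea behind the pipeline (not the original code, which runs these natively in Neo4j GDS): two of the topological link-prediction features, common neighbors and preferential attachment, computed in plain Python on a toy citation graph.

```python
# Toy citation graph as an adjacency map: paper -> set of linked papers.
# Illustrative only; the project computes these features at scale inside GDS.

def common_neighbors(adj, a, b):
    """Number of papers linked to both a and b."""
    return len(adj[a] & adj[b])

def preferential_attachment(adj, a, b):
    """Product of the two node degrees."""
    return len(adj[a]) * len(adj[b])

adj = {
    "p1": {"p2", "p3"},
    "p2": {"p1", "p3", "p4"},
    "p3": {"p1", "p2"},
    "p4": {"p2"},
}

# Feature vector for the candidate link (p1, p4), as fed to the classifier.
features = [common_neighbors(adj, "p1", "p4"),
            preferential_attachment(adj, "p1", "p4")]
```

Feature vectors like this, one per candidate node pair, are what the Random Forest model is trained on to predict whether a citation link should exist.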
AI / ML

Gen-AI Hybrid Skin Disease Detection

Developed a hybrid ML pipeline combining real and StyleGAN2-generated synthetic skin disease images. Fine-tuned EfficientNet models on enriched datasets, boosting accuracy and robustness. Evaluated using FID, LPIPS, ROC-AUC, Precision, Recall, and F1-Score metrics.

StyleGAN2 · EfficientNet · Python · Deep Learning · Computer Vision
AI / ML

Audio-Based Chatbot with Real-Time Assistance

Built a real-time voice-interactive chatbot using OpenAI Whisper (speech-to-text) and Coqui TTS with end-to-end latency under 2 seconds. FastAPI microservices cut latency by 45%; a RAG pipeline combining FAISS and PostgreSQL improved answer relevance by 30%. Fine-tuned a Llama model and deployed via Django + Docker.

FastAPI · Whisper · FAISS · RAG · Llama · Docker · Django · PostgreSQL
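A rough sketch of the retrieval step in a RAG pipeline like this one (an assumed shape, not the deployed service): rank stored passages by cosine similarity to a query embedding and hand the top hits to the LLM as context. The real project uses FAISS for this nearest-neighbour search; the short toy vectors below stand in for actual embeddings.

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def retrieve(query_vec, index, k=2):
    """index: list of (passage_text, embedding) pairs; returns top-k passages."""
    ranked = sorted(index, key=lambda item: cosine(query_vec, item[1]),
                    reverse=True)
    return [text for text, _ in ranked[:k]]

# Hypothetical passages with toy 3-d "embeddings".
index = [
    ("Opening hours are 9am-5pm.", [0.9, 0.1, 0.0]),
    ("Fees are due in September.", [0.1, 0.9, 0.1]),
    ("The library is on campus.", [0.8, 0.2, 0.1]),
]
top = retrieve([1.0, 0.0, 0.0], index, k=2)
```

In the full pipeline the retrieved passages are concatenated into the LLM prompt, which is what lifts answer relevance compared with querying the model alone.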
AI / ML

Fine-Tuned NLP Model for College QA

Fine-tuned a Llama model on college-specific datasets for education-domain queries, achieving 85% accuracy and a 20% improvement over baseline models. Built a custom dataset with augmentation techniques. Deployed on GCP GPU instances supporting 100 concurrent users.

Llama · NLP · GCP · Fine-tuning · Python
Data Engineering

ICRA — Financial Data Platform

Created database objects (tables, stored procedures, views, triggers, functions) with performance tuning. Developed T-SQL pivot/unpivot queries for efficient report analysis. Built SSIS packages for data load and TruBI dashboards.

SQL · T-SQL · SSIS · TruBI · Performance Tuning
Data Engineering

Vistra — Survey Data Platform

Created stored procedures with business logic. Built SSIS packages for base and survey data load. Implemented TruBI reports. Conducted unit testing and UAT for data integration validation across source systems.

SQL · T-SQL · SSIS · TruBI · UAT
Data Engineering

Nielsen Brand Track — Analytics Pipeline

Extensive SSIS work importing, exporting, and transforming data between linked servers. Created database objects using T-SQL in development and production environments. Survey data load utilities and analytics reports via TruBI and SSAS.

SQL · SSIS · SSAS · T-SQL · TruBI
Full-Stack

Online Auction System (CDAC)

Developed a user-friendly online auction platform in .NET with product authentication, a secure bidding server, and a highly scalable architecture supporting large numbers of concurrent bidders in active auctions.

.NET · SQL Server · Web Development · Scalable Architecture

06 · Get In Touch

I'm currently open to new opportunities and always happy to connect. Whether you have a role in mind, a project to collaborate on, or just want to say hello — feel free to reach out!