My Work Experience

I’ve worked as a data engineer for multinational corporations across a range of industries, including banking, insurance, healthcare, finance, utilities, and e-commerce.

Senior Data Engineer | Exact Sciences | Madison, Wisconsin, United States | March 2023 – Present

    Leverage technical expertise to drive data analysis and master data management, with a focus on delivering insightful visualizations and performance-optimized reports. Core responsibilities include designing and automating complex data pipelines, ensuring high data quality, and enhancing monitoring and automated testing processes. Collaborate with analysts and cross-functional teams to develop innovative, scalable solutions that meet evolving business needs. Act as a key resource for insights, recommendations, and support across both technical and non-technical domains. Proactively monitor and recommend architectural improvements, stay current on industry trends, and work closely with IT teams to establish and uphold best practices, standards, and documentation.

Skills: Python · AWS Cloud · Advanced SQL · CI/CD Pipelines · Git Version Control · Databricks · Snowflake · Electronic Health Records (Epic) · Online Transaction Processing (OLTP) · Critical Thinking · RDBMS · Data Architecture · Data Science · Data Modeling · OLAP · Big Data · Data Engineering · Technical Leadership · Business Intelligence · Requirements Gathering · Data Warehousing · Performance Tuning · Extract, Transform, Load (ETL) · Agile Methodologies · Databases · Spark

Senior Data Engineer | Amazon Web Services (AWS) | Madison, Wisconsin, United States | February 2022 – March 2023

    Designed and built data pipelines that tracked AWS’s progress toward its Net Zero Carbon and 100% renewable energy goals, primarily by monitoring the electricity and water consumption of AWS data centers around the globe. Implemented data pipelines supporting analytics on renewable energy, water consumption, environmental sustainability, and energy consumption. Collaborated with multiple teams and product owners to gather requirements for data management, including storing, transforming, and loading data, as well as building dashboards for various analytics use cases. Designed and developed technical solutions for extremely complex data analytics problems in collaboration with internal clients. Owned the design, development, and upkeep of ongoing metrics, reports, analyses, and dashboards used to drive key business decisions. Used SQL and AWS big data technologies, working with other teams to extract, transform, and load data from a wide variety of data sources. Collaborated with other data engineers, data scientists, and business intelligence engineers to build robust data architecture and pipelines.

Skills: Data Governance · Git · Communication · Online Transaction Processing (OLTP) · Critical Thinking · RDBMS · Data Architecture · Cloud Computing · Data Science · Computer Science · Data Modeling · OLAP · Big Data · Data Engineering · PL/SQL · Python · Technical Leadership · Business Intelligence · Requirements Gathering · Data Warehousing · Performance Tuning · Extract, Transform, Load (ETL) · Agile Methodologies · Databases · Microsoft Office · Tableau · Apache Spark · Metadata Management · GitLab · EC2 Instances · ECS (AWS Identity and Access Management, Secrets Manager, Amazon CloudWatch) · Amazon Redshift · Athena · S3 (data lake implementation, backups, snapshots) · AWS Lambda · AWS Glue · AWS VPC · Amazon SNS

Senior Data Engineer | Alliant Energy | Madison, Wisconsin, United States | November 2018 – February 2022

    Translated business requirements into architecture for various business intelligence applications and converted them into high-level and low-level designs. Built conceptual, logical, and physical data models in Erwin, following Kimball frameworks with both 3NF and denormalized dimensional design philosophies. Created technical design documents used throughout development of various ETL projects, along with source-to-target mappings for each project. Produced estimates for data warehousing projects and communicated the work effort to EDW management and stakeholders. Interpreted and analyzed data from various source systems to support data integration and reporting needs. Solved complex integration design challenges. Built Informatica mappings per the technical requirements and design. Performed data profiling and data quality checks and established data relationships. Implemented best practices and standards for data management and conversion projects. Tuned PowerCenter mappings whenever bottlenecks or performance issues arose in the ETL processes. Developed and implemented ETL data pipelines using Python. Researched and troubleshot issues using tracing techniques in both SQL and Informatica. Created and maintained tables, views, stored procedures, triggers, and functions in Oracle to meet specific technical designs. Developed highly performant structures such as staging areas, integrated data layers, data marts, and operational data stores. Recommended ways to improve data reliability, efficiency, and quality. Ensured systems met industry best practices and business, security, privacy, and retention requirements. Created test plans and testing strategies for each business functional and nonfunctional requirement. Scheduled ETL jobs using Tidal. Migrated data from a legacy on-premises platform to the Snowflake cloud warehouse and Informatica Cloud (IICS). Supported business users and customers during the end-to-end implementation of data pipeline solutions. Tuned the performance of Hive/Beeline scripts as well as Oracle SQL.

Skills: Data Governance · Git · Communication · Online Transaction Processing (OLTP) · Critical Thinking · RDBMS · Data Architecture · Cloud Computing · Snowflake · Data Science · Computer Science · Data Modeling · OLAP · Big Data · Data Engineering · PL/SQL · Python · Technical Leadership · Business Intelligence · Requirements Gathering · Data Warehousing · Performance Tuning · Extract, Transform, Load (ETL) · Agile Methodologies · Informatica · Databases · Oracle · Hadoop · Microsoft Office · Microsoft SQL Server · JIRA · Unix Shell Scripting · Apache Spark · Metadata Management · GitLab

Technical Lead | Gallagher Bassett | Chicago, Illinois, United States | April 2018 – December 2018

    Led a technical data team of 5-10 developers through the full development cycle of each solution: design, development, review, testing, implementation, and support. Provided technical direction to management and worked closely with business partners to ensure the technical team delivered scalable, maintainable business intelligence solutions supporting multiple clients’ needs. Built conceptual, logical, and physical data models using the Kimball methodology for an enterprise data warehouse supporting analytical as well as ad hoc and scheduled reporting requirements. Performed impact assessments and evaluated technical and operational feasibility during the requirements gathering phase. Worked closely with management to finalize work estimates and the road map for projects in the pipeline. Designed and developed an ETL layer from scratch to support new initiatives and reporting requirements, and supported existing processes running in production. Performed impact analysis, performance tuning, and capacity planning for the enterprise data warehouse and infrastructure as source systems were added and new integration business rules and logic were introduced. Implemented standards and best practices for the ETL and reporting processes within the existing enterprise data warehouse. Identified and communicated data-driven insights to the business and management. Provided expertise in SQL query writing and data management. Ensured solutions adhered to enterprise standards and aligned with the application and technology stack. Mentored and collaborated with team members to achieve high-performing results. Utilized PowerDesigner 16.6, Visual Studio 2015, SQL Server Data Tools 2012, SQL Server Management Studio, TFS, Power BI, and SQL Server Parallel Data Warehouse (PDW).

Skills: Data Governance · Communication · Online Transaction Processing (OLTP) · Critical Thinking · RDBMS · Data Architecture · Cloud Computing · Computer Science · Data Modeling · OLAP · Big Data · Data Engineering · PL/SQL · Python · Technical Leadership · Business Intelligence · Requirements Gathering · Data Warehousing · Performance Tuning · Extract, Transform, Load (ETL) · Agile Methodologies · SSIS · Databases · Oracle · Microsoft Office · Microsoft SQL Server · Tableau

Senior Data Engineer | Bank of America | Charlotte, North Carolina, United States | June 2017 – April 2018

    Translated business requirements into solution architecture and converted designs into technical solutions. Selected the most appropriate technologies and performed impact assessments and technical/operational feasibility analyses. Interfaced with engineers, product managers, and product analysts to understand data needs. Designed and developed an ETL layer to support initiatives and reporting requirements. Designed, built, and launched new data extraction, transformation, and loading processes in production, and supported existing processes running in production. Worked on Agile projects, translating backlog items into engineering designs and logical units of work and providing daily status in the scrum. Performed impact analysis, performance tuning, and capacity planning for the enterprise data warehouse and infrastructure as source systems were added and new integration business rules and logic were introduced. Managed and communicated data warehouse plans. Developed high-level strategies to solve business problems. Identified and communicated data-driven insights. Provided expertise in SQL query writing and data management. Ensured solutions adhered to enterprise standards and aligned with the application and technology stack. Prepared estimates and schedules for business intelligence projects. Mentored and collaborated with team members to achieve high-performing results. Utilized Q2DM microservices, Microsoft BIDS 2012, Microsoft SSIS/SSRS/SSAS/SSMS 2012, Tableau, NDM, SFTP, JIRA, Rally, Visual C#, Unix, Teradata, Oracle, SVN, Autosys, ER/Studio, Hive, Python, and Talend.

  • Designed and implemented audit strategy from scratch by parsing trace log files.
  • Set up new analytical server for all reporting and analytical purposes using server-to-server transactional replication.
  • Troubleshot various complex performance issues utilizing a variety of tools and techniques and optimized ETL jobs.

Skills: Communication · Online Transaction Processing (OLTP) · Critical Thinking · RDBMS · Data Architecture · Computer Science · Data Modeling · OLAP · Big Data · Data Engineering · PL/SQL · Python · Technical Leadership · Business Intelligence · Requirements Gathering · Data Warehousing · Performance Tuning · Extract, Transform, Load (ETL) · Agile Methodologies · Informatica · SSIS · Databases · Oracle · Hadoop · Microsoft Office · Microsoft SQL Server · Unix Shell Scripting

Technology Lead | American Family Insurance | Madison, Wisconsin, United States | March 2015 – June 2017

    Gathered and understood requirements from business partners and converted them into low-level designs and technical specifications. Partnered with product and engineering teams to solve problems and identify trends and opportunities. Compiled data to analyze, design, develop, troubleshoot, and implement business intelligence applications using various technologies and databases. Created unit test plans, system test plans, and integrated test plans and conducted testing in different environments for various business intelligence applications. Facilitated and led reviews and walkthroughs of technical specifications and program code with technical team members. Prepared estimates and schedules for business intelligence projects. Worked closely with the vendor team to understand the behavior of source data for ETL consumption. Implemented ETL solutions and provided warranty support for existing active applications. Managed the offshore team by analyzing and distributing work among developers. Reviewed deliverables and provided daily updates to the project team. Created proofs of concept, documentation, and knowledge transfer. Worked closely with the release management team, DBAs, enterprise architects, and testing team during design, development, testing, production implementation, and project support. Proactively identified project risks and communicated them to management in a timely manner. Utilized PowerCenter 9.6, Business Objects 4.1, SAS, DB2, Oracle, Greenplum, MS Visio, ER/Studio, Autosys, and the Hadoop ecosystem tools Pig, Hive, Sqoop, and Spark, along with Python.

  • Designed and developed 100+ medium to complex Informatica mappings that satisfy various business rules.
  • Architected and developed an end-to-end solution that serves multiple analytical needs using various big data and reporting tools.

Skills: Communication · Online Transaction Processing (OLTP) · Critical Thinking · RDBMS · Data Architecture · Computer Science · Data Modeling · OLAP · Big Data · Data Engineering · PL/SQL · Python · Technical Leadership · Business Intelligence · Requirements Gathering · Data Warehousing · Performance Tuning · Extract, Transform, Load (ETL) · Informatica · Databases · Oracle · Hadoop · Microsoft Office · Tableau · Unix Shell Scripting · Apache Spark · EC2 Instances · Amazon Redshift · Amazon S3

Technology Lead | Travelers Insurance | Hartford, Connecticut, United States | July 2014 – March 2015

    Translated business requirements into technical requirements and data needs. Designed, coded, tested, debugged, and documented programs and ETL processes. Evaluated user requirements for new or modified functionality and conveyed those requirements to the offshore team. Conceptualized designs and prepared blueprints and other documentation. Provided technical assistance to business users and monitored the performance of ETL processes. Conducted design walkthroughs with the architect and support community to obtain signoff. Collaborated with project managers to prioritize development activities and handle task allocation. Maintained documents for design reviews, audit reports, ETL technical specifications, unit test plans, migration checklists, and schedule plans. Utilized SQL Server, SSIS, VBScript, Excel macros, Teradata, and VB.NET.

  • Translated more than 800 existing Excel macro/VBScript rules onto the Microsoft platform using SSIS and T-SQL.

Skills: Communication · Online Transaction Processing (OLTP) · Critical Thinking · RDBMS · Computer Science · Data Modeling · OLAP · Data Engineering · PL/SQL · Technical Leadership · Business Intelligence · Requirements Gathering · Data Warehousing · Performance Tuning · Extract, Transform, Load (ETL) · SSIS · Databases · Oracle · Hadoop · Microsoft Office · Microsoft SQL Server · Unix Shell Scripting 

Technology Lead | Infosys Technologies Ltd | Charlotte, North Carolina, United States | November 2008 – July 2014

    Participated in technical training covering various aspects of the software development lifecycle and software programming. Developed ETL programs and performed unit, system, and integration testing. Provided implementation and warranty support. Created documentation for the ETL processes as required. Utilized Informatica PowerCenter 8.6, Unix, flat files, Oracle, and XML.

Gathered customer requirements from the business team and developed functional requirements. Designed, developed, and tested SSIS packages and SQL Server programs. Provided solutions for ETL design. Participated in high-level and detail-level design and documentation. Prepared test cases for unit, system, and integration testing. Supported the system integration test cycle by analyzing and fixing defects along with code migration. Performed data validation and code review before deployment. Coordinated with vendors and business users during UAT. Provided guidance, coaching, and mentoring to trainees. Utilized Microsoft SQL Server 2008, SQL Server Business Intelligence Development Studio 2008, Unix, Teradata, Oracle, and SVN.

Acted as team lead and collaborated with offshore team to design and implement robust business intelligence solutions using multiple ETL tools. Coordinated with vendors to correct any issues or defects and for support during tool upgrades. Utilized Informatica PowerCenter 9.1, Teradata, Oracle, Netezza, Mainframe, Metacenter, SQL Server, SSIS.

  • Performed metadata administrator role to satisfy auditing, data lineage and compliance policies within the firm.
  • Onboarded application and business teams onto the metadata management tool, giving demonstrations and facilitating training for business colleagues.

Skills: Data Governance · Communication · Online Transaction Processing (OLTP) · Critical Thinking · RDBMS · Computer Science · Data Modeling · OLAP · Data Engineering · PL/SQL · Technical Leadership · Business Intelligence · Requirements Gathering · Data Warehousing · Performance Tuning · Extract, Transform, Load (ETL) · Agile Methodologies · Informatica · SSIS · Databases · Oracle · Hadoop · Microsoft Office · Microsoft SQL Server · JIRA · Unix Shell Scripting · Metadata Management

Education

My Education Details

Grand Canyon University: Master of Science (MS), Data Science.

Cochin University of Science and Technology: Bachelor’s Degree, Electronics & Biomedical.

Skills

My Top 5 Skills

  • Data Modeling

    Conceptual, logical, and physical data modeling; dimensional modeling; ER modeling.

  • Data Integration

    ETL (Extract, Transform, and Load) processes and data pipelines; data lake implementation.

  • SQL

    Transact-SQL (stored procedures, functions, triggers).

  • Python

    Data integration using Python libraries such as pandas; AWS automation with boto3; integration with APIs, and more.

  • AWS

    EC2 instances, ECS (AWS Identity and Access Management, Secrets Manager, Amazon CloudWatch), Amazon Redshift, Athena, S3 (data lake implementation, backups, snapshots), AWS Lambda, AWS Glue, AWS VPC, Amazon SNS.
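As a small illustration of the data integration and Python skills above, here is a minimal extract-transform-load sketch using only the Python standard library; the meter readings, table name, and column names are hypothetical stand-ins, not data from any of the projects described:

```python
import csv
import io
import sqlite3

# Hypothetical raw feed; in a real pipeline this would come from a file, API, or queue.
RAW_CSV = """meter_id,reading_kwh,region
M-001,120.5,us-east
M-002,98.0,us-west
M-003,143.2,us-east
"""

def extract(text):
    """Extract: parse CSV text into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: cast readings to float and aggregate kWh per region."""
    totals = {}
    for row in rows:
        totals[row["region"]] = totals.get(row["region"], 0.0) + float(row["reading_kwh"])
    return totals

def load(totals, conn):
    """Load: write the aggregates into a reporting table, replacing prior values."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS region_usage (region TEXT PRIMARY KEY, total_kwh REAL)"
    )
    conn.executemany(
        "INSERT OR REPLACE INTO region_usage (region, total_kwh) VALUES (?, ?)",
        totals.items(),
    )
    conn.commit()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    load(transform(extract(RAW_CSV)), conn)
    for region, kwh in conn.execute(
        "SELECT region, total_kwh FROM region_usage ORDER BY region"
    ):
        print(region, kwh)
```

The same extract/transform/load separation carries over to production tooling, with pandas or Spark in the transform step and a warehouse such as Redshift or Snowflake as the load target.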

Contact Me

I Want To Hear From You

Please fill out the form below to get in touch with me.

Copyright © 2023 Manu Mathew. All rights reserved.