Sr. Data Engineer Resume
Chicago, IL
SUMMARY
- Over 9 years of experience in the analysis, design, modeling, development, testing, implementation, and support of business applications using ETL tools and data warehousing applications.
- Expertise in designing, developing, and performance-tuning data integration applications using ETL tools such as IBM InfoSphere DataStage and DBMS platforms such as SQL Server, Oracle, and Teradata.
- Certified Tableau Desktop Specialist. Experience creating Tableau reports, dashboards, data analytics insights, and visualizations using Tableau Desktop.
- Certified MuleSoft Developer and Architect. Experience developing on the Mule ESB platform to deliver SOAP- and REST-based APIs; hands-on experience designing RAML specifications and building APIs with APIkit in Mule applications.
- Hands-on experience integrating disparate data from heterogeneous source systems into an Enterprise Data Warehouse across a variety of database systems such as IBM DB2, Oracle, Teradata, and SQL Server.
- Experience with Spark Streaming and designing data pipelines using big data technologies such as Python and PySpark.
- Experience working with Amazon Web Services (EC2, ELB, VPC, S3, CloudFront, IAM, RDS, Route 53, CloudWatch, SNS).
- Experience creating, debugging, scheduling, and monitoring jobs using Airflow and Oozie (a minimal Airflow sketch follows this list).
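For illustration only, a minimal sketch of the kind of daily extract-and-load DAG described above, assuming Airflow 2.x; the DAG id, task names, and callables are hypothetical placeholders, not project code.

    # Minimal illustrative Airflow 2.x DAG; names and callables are hypothetical.
    from datetime import datetime, timedelta

    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def extract_source_data(**context):
        """Placeholder: pull the day's records from a source system."""
        print("extracting data for", context["ds"])


    def load_to_warehouse(**context):
        """Placeholder: load the staged records into the warehouse."""
        print("loading data for", context["ds"])


    with DAG(
        dag_id="daily_etl_example",
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
        catchup=False,
    ) as dag:
        extract = PythonOperator(task_id="extract", python_callable=extract_source_data)
        load = PythonOperator(task_id="load", python_callable=load_to_warehouse)
        extract >> load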
TECHNICAL SKILLS
ETL Tools: IBM DataStage 7.5/8.1/8.5 (Manager, Designer, Director, Administrator), QualityStage, MS SQL Server Integration Services (SSIS), and Talend.
Business Intelligence: Tableau Desktop
Programming Languages: SQL, Python, PL/SQL, C++
Scripting: Korn shell, Bash, JavaScript, PowerShell, and Node.js
Big Data: Hadoop, Hive, Spark, and AWS
Web Services: SOAP, WSDL, JAX-WS, JAX-RS, REST
DBMS: MS SQL Server 2008, 2008 R2, 2012, 2014, Teradata, Oracle 10g/9i/8i, IBM DB2 10
Scheduling Tools: Airflow, Oozie and Zena
PROFESSIONAL EXPERIENCE
Sr. Data Engineer
Confidential, Chicago IL
Responsibilities:
- Designed and built SSIS packages to extract, transform, and load (ETL) data into the SQL Server database according to business requirements for Treasury Operations; reviewed and validated the packages and the data loaded into the Treasury Enterprise Reporting Database for accuracy.
- Implemented and managed a source control environment for the Treasury SQL Server database using Microsoft Team Foundation Server (TFS).
- Created multiple parameterized stored procedures to process the data and generate notification emails.
- Created SSIS package configurations to implement the package deployment model following Health-tech guidelines.
- Created and managed schema objects such as tables, views, indexes, stored procedures, and triggers, and maintained referential integrity.
- Designed and developed cash position forecasting reports, dashboards, and visualizations using Tableau Desktop in support of Treasury Operations.
- Automated Tableau workflows and programmatically managed Tableau Server content, including workbooks and data sources, using the Tableau Server Client (TSC) and Document APIs.
- Created a custom Tableau workbook migration tool for the Treasury Department using Python and the Tableau Server Client (TSC) and Document APIs (see the sketch following this list).
- Developed Python and Bash scripts to support automated Tableau jobs and load data into the Treasury Operations database; responsible for testing and debugging issues with Tableau report jobs and Python scripts.
- Responsible for report rationalization, ensuring metrics are consistently defined, calculated, and accessed, and for identifying opportunities for enhanced reporting automation.
- Developed Mule flows to integrate data from various financial data sources; defined APIs in RAML, created RESTful interfaces for applications from the RAML files, and connected API interfaces to API implementations.
- Worked with the SMTP, FTP, and SFTP protocols and with flat files, XML, and JSON in Mule ESB.
- Used Mule Expression Language (MEL) to access and evaluate data in the payload, properties, and variables of a Mule message.
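For illustration only, a sketch of a workbook migration using the tableauserverclient (TSC) library's standard sign-in, Pager, download, and publish calls; the server URLs, site name, credentials, and project name are hypothetical placeholders, not the actual migrator tool.

    # Illustrative workbook migration sketch using tableauserverclient (TSC);
    # URLs, site ids, and credentials below are hypothetical placeholders.
    import tableauserverclient as TSC

    source = TSC.Server("https://tableau-source.example.com", use_server_version=True)
    target = TSC.Server("https://tableau-target.example.com", use_server_version=True)

    src_auth = TSC.TableauAuth("svc_user", "password", site_id="finance")
    tgt_auth = TSC.TableauAuth("svc_user", "password", site_id="finance")

    with source.auth.sign_in(src_auth):
        # Download every workbook on the source site to a local staging area.
        workbooks = list(TSC.Pager(source.workbooks))
        paths = [source.workbooks.download(wb.id, filepath=".") for wb in workbooks]

    with target.auth.sign_in(tgt_auth):
        # Re-publish each downloaded workbook into the target site's default project.
        default_project = next(p for p in TSC.Pager(target.projects) if p.name == "Default")
        for path in paths:
            new_wb = TSC.WorkbookItem(project_id=default_project.id)
            target.workbooks.publish(new_wb, path, mode=TSC.Server.PublishMode.Overwrite)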
Sr. Data Engineer /Technical Lead
Confidential, Chicago IL
Responsibilities:
- Responsible for designing and implementing a high-performance, large-volume data integration process to source and process finance data from the Aptitude Accounting Hub.
- Worked across the FSD Organization to create a standard set of management and operational reports.
- Designed and developed the ETL process using IBM Information Server (DataStage and Director) to process daily and monthly financial data from the Aptitude Accounting Hub.
- Enabled advanced reporting, increased timeliness, improved performance analysis, and enhanced information quality for FSD and other business partners.
- Coded Teradata SQL and BTEQ scripts to create reporting views.
- Integrated and automated the daily and monthly financial extract and load process from the Aptitude Accounting Hub into the reporting layer.
- Worked in collaboration with the Accounting Hub and Cognos teams to deliver reports in a timely manner.
Sr. Data Engineer /Technical Lead
Confidential, Chicago IL
Responsibilities:
- Designed and developed the ETL process using IBM Information Server (DataStage and Director) for de-identification and re-identification of membership information following PHI guidelines.
- Designed and developed the ETL process for extracting medical and pharmacy claims from the Enterprise Data Warehouse (EDW).
- Coded Teradata SQL and BTEQ scripts to create membership extracts for the in-scope population.
- Integrated and automated the membership and claims data exchange process with vendors per the data exchange calendar using E-gateway/Axway.
- Automated the vendor outreach and response roster file load process using the Zena scheduler and Unix scripts.
- Worked in collaboration with the Program Evaluation and Reporting teams to determine the per member per month (PMPM) calculations for the program (a simple PMPM illustration follows this list).
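For illustration only, the standard PMPM arithmetic: total paid dollars for the period divided by total member months (the sum of monthly enrollment counts). The function name and figures below are hypothetical, not program data.

    # Hedged sketch of a standard PMPM calculation; values are hypothetical.
    def pmpm(total_paid, monthly_enrollment):
        """PMPM = total paid amount / sum of member months over the period."""
        member_months = sum(monthly_enrollment)
        if member_months == 0:
            raise ValueError("no member months in the period")
        return total_paid / member_months

    # Example: $1.2M in claims over 12 months of enrollment counts.
    enrollment = [980, 1002, 1010, 995, 1000, 1008, 1015, 1020, 1012, 1005, 998, 990]
    print("PMPM: $%.2f" % pmpm(1_200_000, enrollment))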
Sr. ETL Developer/Data Analyst
Confidential
Responsibilities:
- Designed and developed the ETL process using IBM Information Server (DataStage) for the vendor outreach process, loading enrollee/servicing provider information and generating vendor outreach files via the HCSC E-gateway system.
- Designed and implemented the supplemental vendor response file process to receive, store, and validate diagnosis code files from applicable vendors against the predefined rules established for the supplemental process.
- Designed and developed a claims matching process to match each enrollee to the specified servicing provider NPI captured from the vendor response file.
- Designed change data capture jobs to compare datasets and record differences, determining added and deleted diagnosis codes from vendor response files (see the sketch following this list).
- Gained experience with the HHS EDGE server submission process by working on supplemental data extraction and submission for 2014 and 2015.
- Automated the vendor outreach and response load process using the Zena scheduler and Unix scripts.
- Worked in collaboration with the Risk and Revenue Optimization team to prepare documentation for the Initial Validation Audit (IVA).
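For illustration only, a pandas sketch of the change-data-capture comparison described above, diffing two vendor response extracts keyed on enrollee and diagnosis code; the file names and column names are hypothetical placeholders, not the DataStage implementation.

    # Sketch of a CDC-style diff in pandas: classify each (enrollee, diagnosis
    # code) pair as added or deleted between two extracts. Names are hypothetical.
    import pandas as pd

    KEY = ["enrollee_id", "diagnosis_code"]

    prev = pd.read_csv("vendor_response_prev.csv")[KEY].drop_duplicates()
    curr = pd.read_csv("vendor_response_curr.csv")[KEY].drop_duplicates()

    # Outer-merge with an indicator column to classify each key pair.
    diff = prev.merge(curr, on=KEY, how="outer", indicator=True)
    added = diff[diff["_merge"] == "right_only"][KEY]   # new diagnosis codes
    deleted = diff[diff["_merge"] == "left_only"][KEY]  # dropped diagnosis codes

    added.to_csv("added_codes.csv", index=False)
    deleted.to_csv("deleted_codes.csv", index=False)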