
Data Solutions Engineer Resume

Denver, Colorado


  • Over 18 years of experience in software application/systems integration, data warehouse/data mart, and database development, playing different roles in technology departments.
  • Worked as ETL Lead, Solutions Architect, Senior ETL Developer, Programmer/Analyst, Technical Implementation Lead, Application Admin, Tech Support, IT Coordinator, and Software Engineer; strong technical and communication skills; a creative solution provider, confident and independent, with leadership skills and good decision-making abilities.
  • Over three years of experience in SAP BODS data integration, five-plus years of experience with Informatica PowerCenter, ten years of experience with Oracle and SQL, and three years of working experience with ASP.NET client-server development tools and technologies.
  • Proven track record of successful implementation of all project life cycle stages, from requirements gathering and planning through application design specifications, coding, release, implementation, support, and enhancements.
  • Experience in data warehouse and data mart design and coding using Informatica PowerCenter and SAP BODS Data Services as ETL tools, and in developing ETL packages using PL/SQL.
  • Technical expertise in relational database management systems (RDBMS) such as Oracle 12c/10g and SQL Server, in PL/SQL programming, and in loading data to Vertica.
  • Expertise in writing Python, UNIX shell, Windows PowerShell, and VBS scripts for development and maintenance.
  • Passionate about performance tuning of Informatica and SAP BODS jobs, workflows, data flows, and mappings.
  • Experience in developing stored procedures and packages in Oracle 12c/10g/9i and SQL Server 2005 and 2008.
  • Integrated financial data and extracted data from the Facets tool for a healthcare ETL system, using Informatica to load Oracle and Vertica.
  • Experience applying object-oriented programming (OOP) principles in web application development.
  • Analyzed and designed workflow processes and provided technical support for completion of workflow tasks.
  • Led offshore/onsite developers and technical support teams on various projects.
  • Extensive knowledge of dimension and fact design for data warehouses and data marts.
  • Hands-on experience with MuleSoft and Alteryx; completed MuleSoft training in 2017.


Tools/Utilities: Informatica 9.1, SAP BODS Data Integrator, Autosys, MOVEit, SQL Data Modeler, GitLab, MuleSoft, Harvest, TOAD, SQL*Loader, ProArc, Microsoft Site Server, Brava Server (Tomcat), Microsoft Project, Net-It, Facets

Databases: Vertica, Oracle 12c/10g/9i, PL/SQL, SQL Server 2005 and 2008

Internet Technologies/GUI: ASP.NET, HTML, VB.NET, Visual Studio, Visual Basic

Scripting: Python 3, UNIX shell scripts, VBScript, PowerShell scripts


Confidential, Denver, Colorado

Data Solutions Engineer


  • Analyzed data coming from a legacy system for SalesPage integration and created a process to extract data from SalesPage to an Oracle database for downstream reporting systems.
  • Developed ETL and ELT processes using Informatica to load data from SalesPage flat files to Oracle database tables with incremental-load and full-load logic.
  • Designed and implemented marketing data loads to Vertica using Informatica, with Oracle and JSON files as sources, converting JSON to CSV using Python scripts.
  • Created jobs in Autosys to automate data loads, along with MOVEit jobs.
  • Developed materialized views and views for downstream data processing from partitioned tables.
  • Troubleshot and fixed Informatica issues, provided production support and solutions, and mentored the developers on the team.
  • Worked on code optimization and performance tuning for Informatica ETL loads to Oracle and Vertica and for IBM mainframe legacy systems.
  • Participated in team code reviews and walkthroughs with the architecture team, following industry-standard best practices.
  • Regularly participated in the release management process to deploy code from Dev to QA, UAT, and Prod using SVN and Git.
  • Created project and mapping documentation for QA testing.
  • Created data validation queries to check the data loads.
  • Fetched data coming from the SalesPage source via SQL Server SSIS packages.
  • Created Python scripts for data conversions, and UNIX shell scripts to pull the delta files for the hourly loads and load them via Informatica workflows.
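The JSON-to-CSV conversion step above can be sketched as follows; this is a minimal illustration with hypothetical field names, not the production script:

```python
import csv
import io
import json

def json_to_csv(json_text, fieldnames):
    """Flatten a JSON array of records into CSV text ready for a bulk load.
    Fields missing from a record are written as empty strings; extra fields
    are ignored."""
    records = json.loads(json_text)
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=fieldnames,
                            restval="", extrasaction="ignore")
    writer.writeheader()
    writer.writerows(records)
    return out.getvalue()

# Example: two marketing records, one missing the "channel" field.
sample = '[{"campaign_id": 1, "channel": "email"}, {"campaign_id": 2}]'
csv_text = json_to_csv(sample, ["campaign_id", "channel"])
```

The resulting CSV text can then be staged to a file for the Vertica bulk load.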

Confidential, Denver, Colorado

Sr. Software Developer/Solutions Architect


  • Implemented a data mart for client reporting.
  • Supported and maintained the data warehouse and data mart using SAP BODS as the ETL tool.
  • Developed Informatica workflows to populate data into the Oracle-based data warehouse and data marts.
  • Implemented a process to load historical database dump files per wave through UNIX scripts, then load the data to the data mart via SAP BODS workflows.
  • Designed and developed BusinessObjects Data Services jobs to integrate the historical data through control tables, then perform key conversions and load to the data mart.
  • Played a tech lead role on multiple data warehouse, data mart, and other system projects, covering development, testing, and implementation support with onsite and offshore teams.
  • Designed the data mart's conceptual and physical data models and tables, with partitioning and indexing options, for client reporting using SQL Data Modeler.
  • Created the role hierarchy, grants, and database sizing for the new data mart implementation.
  • Developed the snowflake schema structure for the new data mart.
  • Implemented column-level security using Oracle VPD policies; created procedures and functions to get security keys for each user.
  • Implemented the design and techniques to load data in bulk or in parts at the lowest grain in the data warehouse.
  • Developed and tuned Data Services jobs to load monthly snapshots from the data warehouse to the data mart.
  • Tuned jobs, and mentored the team on tuning, so that monthly loads finish within the SLA.
  • Used the Type 2 dimension concept with the History Preserving transform and proper business logic.
  • Created data mappings for GoldenGate for Oracle 12c.
  • Developed data reconciliation scripts to check loaded data.
  • Provided solutions for fixing data in the data warehouse for production issues.
  • Created jobs, workflows, and data flows in the SAP BusinessObjects Data Services tool.
  • Implemented the Jira Scrum lifecycle for the team and used GitLab for source control.
  • Used Oracle partition-swapping techniques to tune the ETL jobs.
  • Implemented a code review system and standards to be followed for development.
  • Provided production support on a rotation basis for the daily and monthly batches.
  • Developed and supported Informatica ETL workflows and sessions for the finance team to send balance data.
  • Worked on Informatica optimization and performance tuning for the finance team.
  • Developed ETL mappings to load data from Salesforce CRM to the data warehouse.
  • Worked on Hadoop clusters for a proof of concept to evaluate implementation options.
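The Type 2 dimension logic with history preserving mentioned above can be sketched in Python; the row layout and key names here are hypothetical stand-ins for what the SAP BODS History Preserving transform does:

```python
from datetime import date

def apply_type2(dimension, incoming, load_date):
    """Apply Type 2 slowly-changing-dimension logic: when a tracked
    attribute changes, expire the current row (set effective_to) and
    append a new current row; new keys are simply inserted."""
    current = {r["key"]: r for r in dimension if r["effective_to"] is None}
    for rec in incoming:
        row = current.get(rec["key"])
        if row is not None and row["attrs"] == rec["attrs"]:
            continue  # no change: keep the current version
        if row is not None:
            row["effective_to"] = load_date  # expire the old version
        dimension.append({"key": rec["key"], "attrs": rec["attrs"],
                          "effective_from": load_date, "effective_to": None})
    return dimension

# Example: customer C1 moves region, producing a second, current version.
dim = [{"key": "C1", "attrs": {"region": "West"},
        "effective_from": date(2015, 1, 1), "effective_to": None}]
apply_type2(dim, [{"key": "C1", "attrs": {"region": "East"}}], date(2016, 6, 1))
```

The open-ended `effective_to` marks the current row, which is the same convention the history-preserving load relies on when joining facts to the dimension.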

Confidential, Denver, CO

Sr. Informatica Developer


  • Designed and developed complex ETL processes for the data warehouse using Informatica 9.1 and Oracle 11g.
  • Worked directly with business users to define business processes and requirements for complex ETLs.
  • Created application design specifications by gathering requirements from business analysts, and obtained approvals from various business and technical divisions.
  • Developed several complex mappings to extract data from the Facets database, applying business logic through different transformations.
  • Developed a reconciliation process for the current extraction to create files per requirements and FTP them to different groups.
  • Improved the performance of several live production workflows using performance-tuning techniques.
  • Developed an extraction process in Informatica from which various reports were generated on a daily basis.
  • Added new daily scheduled job processes to the current job scheduling system.
  • Automated a few manual update jobs using Informatica.
  • Prepared and executed test scripts for all levels of testing.
  • Provided technical guidance in coding, testing, QA, and pre- and post-implementation.
  • Actively participated in testing and assisted with troubleshooting, including test scripts, data recovery in case of loss, and timely issue resolution.
  • Provided implementation support for the QA and Prod environments.
  • Created Harvest packages for Informatica deployment.
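The reconciliation idea above, checking an extraction against what was loaded, can be illustrated with a minimal Python sketch (column names hypothetical):

```python
def reconcile(source_rows, target_rows, key, measure):
    """Compare row counts, a summed measure, and key coverage between a
    source extraction and the loaded target, returning the discrepancies
    for a reconciliation report."""
    src_keys = {r[key] for r in source_rows}
    tgt_keys = {r[key] for r in target_rows}
    return {
        "row_count_diff": len(source_rows) - len(target_rows),
        "measure_diff": (sum(r[measure] for r in source_rows)
                         - sum(r[measure] for r in target_rows)),
        "missing_in_target": sorted(src_keys - tgt_keys),
        "extra_in_target": sorted(tgt_keys - src_keys),
    }

# Example: one claim row was extracted but never loaded.
src = [{"claim_id": 1, "amount": 100.0}, {"claim_id": 2, "amount": 50.0}]
tgt = [{"claim_id": 1, "amount": 100.0}]
report = reconcile(src, tgt, "claim_id", "amount")
```

A non-empty `missing_in_target` list or a non-zero measure difference is what would flag the load for investigation before the files are sent out.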

Confidential, Houston

Technical Analyst- ETL Developer, BCP Lead


  • Designed ETL processes for the data warehouse using Informatica and PL/SQL.
  • Developed ETL integrations in Informatica PowerCenter 8.6, using various transformations to write data to the data warehouse databases.
  • Attended sizing meetings and daily update meetings with the team under the Agile Scrum methodology.
  • Worked in various versioned repositories within Informatica, such as Dev, QA, and Production.
  • Created documentation for ETL processes to mentor the testing team.
  • Worked on performance tuning in Informatica.
  • Supported production releases of ETL processes.
  • Created stored procedures using PL/SQL, and complex queries for day-to-day work and reports, on an Oracle 10g database using TOAD.
  • Designed, developed, and implemented a three-hour BCP (Business Continuity Plan/Disaster Recovery Plan) for a .NET application with an automated process.
  • Integrated IIS 6.0 Metabase.xml files to import websites from one server to another with scripts, and automated IIS backups. Designed the backup and restore plan for the Oracle 10g database.
  • Developed VBS scripts to monitor server components, create system events with alerts to admin groups, and automate the IIS worker process.
  • Used VBS and PowerShell scripting for various alert notifications from servers; designed batch files and scheduled tasks.
  • Created PL/SQL packages and procedures for data extraction.
  • Doctrax: supported a web-based ASP application developed for document tracking.
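The server-component monitoring-and-alert pattern above can be sketched as follows, shown in Python rather than VBS for illustration (component names and the probe/notify callables are hypothetical stand-ins):

```python
def poll_components(components, probe, notify, state, threshold=3):
    """One monitoring pass: reset a component's consecutive-failure
    counter on a healthy probe, increment it on a failed probe, and
    send a single alert to the admin group when the threshold is
    first reached."""
    alerted = []
    for name in components:
        if probe(name):
            state[name] = 0
        else:
            state[name] = state.get(name, 0) + 1
            if state[name] == threshold:
                notify(f"ALERT: {name} is down after {threshold} checks")
                alerted.append(name)
    return alerted

# Example with stubbed probe/notify: the IIS worker process fails three
# consecutive passes, so the third pass raises exactly one alert.
alerts = []
state = {}
for _ in range(3):
    down = poll_components(["w3wp"], lambda n: False, alerts.append, state)
```

Requiring consecutive failures before alerting avoids paging the admin group on a single transient probe timeout.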
