
ETL Developer/Build Coordinator Resume

Jessup, PA


  • 9+ years of professional experience with expertise in Design, Development and Implementation of Data Warehouse applications and Database business systems.
  • Good knowledge of business verticals including Health Care, Retail, and Public Sector.
  • Deep knowledge of mapping the functional units of FACETS to the EDW model.
  • Integrated FACETS modules such as member, network provider, claim, member provider, and billing.
  • Experience in working with various databases, including Oracle, Gemfire XD, Greenplum, PostgreSQL, MS SQL Server, and flat files.
  • Experience in all phases of development, including Extraction, Transformation and Loading (ETL) of data from various sources into Data Warehouses and Data Marts, using Agile and Waterfall methodologies.
  • Proficiency in Informatica Power Center (Repository Manager, Designer, Server Manager, Workflow Manager, and Workflow Monitor), SSIS and custom C# packages.
  • Experience in maintenance and enhancement of existing data extraction, cleansing, transformation, and loading processes to improve efficiency.
  • Experience with relational and dimensional models using Facts and Dimensions tables.
  • Experience in using SSIS tools like Import and Export Wizard, Package Installation, and SSIS Package Designer.
  • Worked with Oracle Stored Procedures and experienced in loading data into Data Warehouse/Data Marts using Informatica, SQL*Loader, Export/Import utilities.
  • Expert in using Informatica Power Center 9.1/8.6.0/8.1.0/7.1 and Informatica Power Exchange for extraction, transformation and loading mechanism.
  • Extensively used Informatica Workflow manager and Workflow monitor for creating and monitoring workflows, Worklets and sessions.
  • Proficient in server-side programming: stored procedures, stored functions, database triggers, and packages using PL/SQL and SQL.
  • Experience in Dimensional Modeling using Star and Snowflake schemas.
  • Experience in UNIX Shell Scripting and Autosys for scheduling the Workflows.
  • Experience in writing Test plans, Test cases, Unit testing, System testing, Integration testing and Functional testing.
  • Expertise in documenting the ETL process, Source to Target Mapping specifications, status reports and meeting minutes.
  • Very good understanding of ‘Versioning’ concepts and worked with SVN and Informatica versioning. Worked extensively with versioned objects and deployment groups.
  • Experienced with database archiving processes managed centrally for all data, whether archived on-premises or in the cloud.
  • Worked in Greenplum Integrations.
  • Good experience in data analysis, error handling, error remediation and impact analysis.
  • Experience in Agile and Waterfall methodologies.
  • Versatile team player with excellent analytical, communication and presentation skills.


ETL/DWH Tools: Informatica Power Center 9.1/8.x/7.1, Teradata 12.0 (BTEQ, FastLoad, TPump), SSIS.

Databases: Oracle 10g/9i/8i, MS Access, MS SQL Server 2012/2008/2005, DB2, Teradata 14.0/12.0, Greenplum 1.16.1, Gemfire XD.

DBMS/Query Tools: TOAD, Rapid SQL, SQL Developer, WinSQL, SQL Assistant, SQL Navigator, PL/SQL Developer, pgadmin3, SQuirreL SQL Client 3.6.

Operating Systems: Microsoft Windows - Vista, XP, 2000, NT 4.0, OS/2

UNIX - Sun Solaris, HP-UX

Programming Languages: SQL, PL/SQL, PostgreSQL, UNIX Shell Scripting, T-SQL, VBScript, Java, C, C++, C#.

Data Analysis: Data Design/Analysis, Business Analysis, User Requirement Gathering, User Requirement Analysis, Gap Analysis, Data Cleansing, Data Transformations, Data Relationships, Source Systems Analysis and Reporting Analysis.


Confidential, Jessup, PA

ETL Developer/Build Coordinator


  • Worked as a developer on the ETL framework development of the MOMs EDW, which uses MS SQL Server, Gemfire XD, and Greenplum as its architecture stack.
  • Gained exposure to the custom C# package that generates SSIS packages from the ETL metadata framework.
  • Worked with the architect to establish the data quality methodology, conduct gap analysis, and perform design reviews.
  • Configured the metadata framework to convert existing processes as-is to the new environment for migration.
  • Re-engineered non-performing Greenplum functions into a metadata framework suitable for SSIS package migration.
  • Created conversion tables in MS SQL to maintain internal data integrity and publish master keys in the system.
  • Ensured successful development and unit testing of the SSIS package to read data from FACETS, transform it, and load it into the staging table in SQL Server.
  • Configured the metadata framework to load into staging tables and then into Gemfire tables.
  • Wrote DDLs, queries, and unit-testing scripts, and created indexes and primary key constraints in Gemfire XD using SQuirreL.
  • Worked on Greenplum functions for creating the aggregates for the dashboard reports.
  • Wrote Greenplum functions to load data from flat files to dimension and fact tables.
  • Involved in Unit testing, Code Reviewing, Moving in SIT.
  • Central point of contact for cross-stream touch points involved in the MOMs project implementation.
  • Received appreciation throughout the development phase.
  • Incorporated enterprise ETL development standards throughout the development lifecycle.
  • Participated in project scrum/status meetings and provided daily status.
  • Liaised with production team to analyze bugs and resolve issues.
  • Established best practices standards and ensured adherence to them.
  • Created the mapping document for migrating Greenplum functions to the SSIS metadata framework.
  • Documented the design, data flow diagrams, unit test cases, metadata framework design document, etc.
  • Documented the MCRR (cross-stream project) process for the data load.
  • Generated 837 I and P XML by customizing the SSIS package and metadata tables, in line with healthcare rules and regulations (HIPAA).
  • Created mapping documents to outline data flow from sources to targets.
  • Involved in deployment activities: created and updated the metadata DDL/stage DDL, metadata DML, configuration DML, etc., and committed the changes to SVN.
  • Prepared release notes and deployment forms and coordinated approval for the deployment from the client's manager.
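The metadata-driven package generation described above can be sketched roughly as follows. This is an illustrative Python sketch only, not the actual C#/SSIS framework; the metadata fields (load_order, enabled, src_query, tgt_table) are hypothetical stand-ins for whatever the real metadata tables contain:

```python
# Illustrative sketch of a metadata-driven ETL framework: metadata rows
# describe each load, and the framework turns them into ordered steps.

def build_load_steps(metadata_rows):
    """Turn metadata rows into ordered (source query, target table) steps,
    skipping any loads that are disabled in the metadata."""
    steps = []
    for row in sorted(metadata_rows, key=lambda r: r["load_order"]):
        if row["enabled"]:
            steps.append((row["src_query"], row["tgt_table"]))
    return steps

# Hypothetical metadata, loosely modeled on a FACETS-to-staging load.
metadata = [
    {"load_order": 2, "enabled": True,
     "src_query": "SELECT * FROM facets.claim", "tgt_table": "stg.claim"},
    {"load_order": 1, "enabled": True,
     "src_query": "SELECT * FROM facets.member", "tgt_table": "stg.member"},
    {"load_order": 3, "enabled": False,
     "src_query": "SELECT * FROM facets.billing", "tgt_table": "stg.billing"},
]

for query, target in build_load_steps(metadata):
    print(f"{target} <- {query}")
```

Converting a process "as-is" then amounts to adding metadata rows rather than writing a new package by hand.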

Environment: Visual Studio 2010, MS SQL Server 2012, pgAdmin 1.16.1 for Greenplum, Gemfire XD, SQuirreL SQL Client 3.6, ADO.NET, flat files, CSV files, PostgreSQL (functions).

Confidential, Dallas,TX

Sr. Informatica Developer


  • Designed, developed, and maintained the Socrates project end to end and received a best performance award.
  • Central point of contact for multiple high-visibility projects with highly complex business functionality.
  • Worked on Informatica Power Center tools- Designer, Repository Manager, Workflow Manager, and Workflow Monitor.
  • Parsed high-level design specification to simple ETL coding and mapping standards.
  • Designed and customized data models for the Data Warehouse, supporting data from multiple sources in real time.
  • Involved in building the ETL architecture and Source to Target mapping to load data into Data warehouse.
  • Created mapping documents to outline data flow from sources to targets.
  • Involved in Dimensional modeling (Star Schema) of the Data warehouse and used Erwin to design the business process, dimensions and measured facts.
  • Maintained source definitions, transformation rules, and target definitions using Informatica Repository Manager.
  • Worked with Data Quality architect and administrator to establish a data quality methodology, documenting a repeatable set of processes for determining, investigating, and resolving data quality issues, and established an on-going process for maintaining quality data.
  • Designed and developed advanced reusable UNIX shell scripts for ETL auditing, error handling and automation.
  • Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, Stored Procedure, and Union to develop robust mappings in the Informatica Designer.
  • Developed mapping parameters and variables to support SQL override.
  • Created mapplets & reusable transformations to use them in different mappings.
  • Developed mappings to load into staging tables and then to Dimensions and Facts.
  • Used existing ETL standards to develop these mappings.
  • Worked on different tasks in Workflows like sessions, events raise, event wait, decision, e-mail, command, Worklets, Assignment, Timer and scheduling of the workflow.
  • Implemented slowly changing dimension Type 1 and Type 2 for Change data capture using Version control.
  • Developed advanced reusable mapplets for SCD and fact loading.
  • Involved in Unit testing, code reviewing, moving in UAT and PROD.
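The Type 2 slowly changing dimension logic mentioned above follows a standard pattern: when a tracked attribute changes, the current dimension row is expired and a new current version is inserted. A minimal Python sketch of that pattern (not the actual Informatica mapping; column names such as natural_key, attrs, and is_current are illustrative):

```python
from datetime import date

def scd2_apply(dim_rows, incoming, today=None):
    """Apply SCD Type 2 change data capture: expire the current row when
    an attribute changes and append a new current version."""
    today = today or date.today()
    # Index the currently-active dimension rows by natural key.
    current = {r["natural_key"]: r for r in dim_rows if r["is_current"]}
    for rec in incoming:
        old = current.get(rec["natural_key"])
        if old is None or old["attrs"] != rec["attrs"]:
            if old is not None:
                old["is_current"] = False   # close out the old version
                old["end_date"] = today
            dim_rows.append({"natural_key": rec["natural_key"],
                             "attrs": rec["attrs"],
                             "start_date": today, "end_date": None,
                             "is_current": True})
    return dim_rows
```

In Informatica terms, the lookup on `current` corresponds to a Lookup transformation on the dimension, and the expire/insert branches to an Update Strategy feeding the target.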

Environment: Informatica Power Center 8.6.1, Oracle 10g, flat files, CSV files, PL/SQL (stored procedures, triggers, packages), UNIX, TOAD.


Sr. Informatica Developer


  • Worked with various Databases for extracting the files and loading them into different databases.
  • Designed ETLs and conducted review meetings.
  • Played a lead role in managing the offshore team.
  • Worked mainly on troubleshooting errors that occurred during the loading process.
  • Created stored procedures, triggers, tables, indexes, rules, etc. as needed to support extraction, transformation and load (ETL) processes.
  • Designed and developed integrations to export the customer data from the EDW to Greenplum DW.
  • Designed and developed integrations to import back the customer scores from flat files generated by Greenplum.
  • Created a replication process to keep re-engineered database in sync with legacy database.
  • Extensively worked on creating mapping parameters and variables which are used in various workflows for reusability.
  • Worked with various active transformations in Informatica Power Center like Filter Transformation, Aggregator Transformation, Joiner Transformation, Rank Transformation, Router Transformation, Sorter Transformation, Source Qualifier, and Update Strategy Transformation.
  • Extensively worked with various Passive transformations in Informatica Power Center like Expression Transformation, and Sequence Generator.
  • Extensively worked with Slowly Changing Dimensions Type1, Type2, for Data Loads.
  • Responsible for Performance Tuning at the Mapping Level, Session Level, Source Level, and the Target Level.
  • Involved in writing Stored Procedures in SQL and extensively used this transformation in writing many scenarios as per the requirement.
  • Worked with re-usable objects like Re-Usable Transformation and Mapplets.
  • Extensively worked with aggregate functions like Avg, Min, Max, First, Last in the Aggregator Transformation.
  • Extensively used SQL Override function in Source Qualifier Transformation.
  • Extensively used Normal Join, Full Outer Join, Detail Outer Join, and Master Outer Join in the Joiner Transformation.
  • Extensively worked with Incremental Loading using Parameter Files, Mapping Variables, and Mapping Parameters.
  • Wrote queries and procedures, created indexes and primary keys, and performed database testing.
  • Analyzed, fixed, tested, tracked, and reviewed defects.
  • Implemented various Performance Tuning techniques on Sources, Targets, Mappings, and Workflows.
  • Used Source Analyzer and Warehouse Designer to import the source and target database schemas, and the Mapping Designer to map the sources to the target.
  • Involved in performance tuning and monitoring (both SQL and Informatica) considering the mapping and session performance issues.
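The incremental loading with parameter files, mapping variables, and mapping parameters described above boils down to a watermark pattern: each run picks up only rows changed since the previous run, then advances the watermark. A hedged Python sketch (the `params` dict and `LAST_RUN_TS` name stand in for an Informatica parameter file and a `$$LAST_RUN_TS`-style mapping parameter; neither is the actual implementation):

```python
def run_incremental(rows, params):
    """One incremental load cycle. `params` mimics a parameter file
    holding the last-run watermark; only rows updated after it are
    selected, and the watermark advances after a successful pass."""
    delta = [r for r in rows if r["updated_at"] > params["LAST_RUN_TS"]]
    if delta:
        # Advance the watermark to the newest row we just processed.
        params["LAST_RUN_TS"] = max(r["updated_at"] for r in delta)
    return delta
```

Running the same cycle twice against unchanged source data returns an empty delta the second time, which is the property that makes the load safely re-runnable.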

Environment: Informatica Power Center 9.1, Oracle 10g, Web Services, MS Access 2010, SQL*Loader, UNIX, WinSCP, PuTTY, SQL Developer, PL/SQL.


ETL Informatica Developer


  • Worked with Business analysts to understand business/system requirements in order to transform business requirements into effective technology solutions by creating Technical Specifications (Source to Target Documents) for the ETL from the Functional Specifications.
  • Worked on Informatica tools like Designer, Workflow Manager, and Workflow Monitor.
  • Worked mostly on Lookup, Aggregator, and Expression Transformations to implement complex logics while coding a Mapping.
  • Worked with Memory cache for static and dynamic cache for the better throughput of sessions containing Rank, Lookup, Joiner, Sorter and Aggregator transformations.
  • Monitoring the Jobs, observing the performance of the individual transformations and tuning the code (Performance Tuning).
  • Migrated objects between environments using XML Export/Import (via Repository Manager).
  • Developed code to load data from flat files to stage and from stage to ODS.
  • Developed mappings with XML as the target, formatting the target data according to the requirements.
  • Designed and developed Informatica mappings for data loads and data cleansing.
  • Created mappings to incorporate incremental loads.
  • Developed reusable Mapplets to include the Audit rules.
  • Worked with Power Center versioning (check-in, check-out), querying to retrieve specific objects, and maintaining the history of objects.
  • Designed the Workflow for the ODS load following the load Dependencies.
  • Involved in fixing Invalid Mappings, testing of Stored Procedures and Functions, Unit and Integration Testing of Informatica Sessions, Batches and Target Data.
  • Solely responsible for the daily loads and for handling the reject data.
  • Generated queries using SQL to check for consistency of the data in the tables and to update the tables as per the requirements.
  • Created stored procedures and sequences to insert keys into database tables.
  • Wrote Perl scripts for file transfers and file renaming, and a few other database scripts executed from UNIX.
  • Extensively used Perl scripting to schedule the Informatica jobs.
  • Maintained and fixed bugs in existing Perl scripts.
  • Guided the testing team and the development team and monitored the Implementation.
  • Provided support for the Nightly jobs.
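The file-transfer and renaming scripts above were written in Perl; as an illustration of the pattern only, here is a small Python stand-in. The directory layout (a landing area for the load plus a rename-on-archive copy) and the `_processed` suffix are assumptions, not the actual script:

```python
import shutil
from pathlib import Path

def stage_incoming_file(src, landing_dir, archive_dir):
    """Move an incoming flat file into the landing area for the nightly
    load and keep a renamed copy in the archive for audit/reruns."""
    src = Path(src)
    landing = Path(landing_dir) / src.name
    archive = Path(archive_dir) / f"{src.stem}_processed{src.suffix}"
    shutil.copy2(src, archive)      # keep an audit copy with a new name
    shutil.move(str(src), landing)  # hand the file to the ODS load
    return landing, archive
```

Keeping the renamed archive copy is what lets a failed nightly job be re-run without re-requesting the source extract.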

Environment: Informatica Power Center 8.6.1, Oracle 10g, PL/SQL Developer, XML, Windows XP, Rational ClearQuest, Perl, TOAD.
