
ETL Developer/Build Coordinator Resume


Dallas, TX

SUMMARY

  • 9+ years of professional experience with expertise in Design, Development and Implementation of Data Warehouse applications and Database business systems.
  • Good knowledge of the Health Care domain.
  • Deep knowledge of mapping the functional units of FACETS to the EDW model.
  • Integrated FACETS model entities such as member, network provider, claim, member provider, and billing.
  • Experience in working with various databases including Oracle, Gemfire XD, Greenplum, PostgreSQL, and MS SQL Server, as well as flat file sources.
  • Experience in all phases of development including Extraction, Transformation and Loading (ETL) of data from various sources into Data Warehouses and Data Marts in Agile and Waterfall methodologies.
  • Proficiency in Informatica Power Center (Repository Manager, Designer, Server Manager, Workflow Manager, and Workflow Monitor), SSIS and custom C# packages.
  • Experience in maintenance and enhancement of existing data extraction, cleansing, transformation, and loading processes to improve efficiency.
  • Experience with relational and dimensional models using Facts and Dimensions tables.
  • Experience in using SSIS tools like the Import and Export Wizard, Package Installation, and the SSIS Package Designer.
  • Worked with Oracle Stored Procedures and experienced in loading data into Data Warehouse/Data Marts using Informatica, SQL*Loader, Export/Import utilities.
  • Expert in using Informatica Power Center 9.1/8.6.0/8.1.0/7.1 and Informatica Power Exchange for extraction, transformation and loading mechanism.
  • Extensively used Informatica Workflow manager and Workflow monitor for creating and monitoring workflows, Worklets and sessions.
  • Proficient in server-side programming such as stored procedures, stored functions, database triggers, and packages using PL/SQL and SQL.
  • Experience in Dimensional Modeling using Star and Snowflake schemas (a minimal star-schema sketch follows this summary).
  • Experience in UNIX Shell Scripting and Autosys for scheduling the Workflows.
  • Experience in writing Test plans, Test cases, Unit testing, System testing, Integration testing and Functional testing.
  • Expertise in documenting the ETL process, Source to Target Mapping specifications, status reports and meeting minutes.
  • Very good understanding of ‘Versioning’ concepts and worked with SVN and Informatica versioning. Worked extensively with versioned objects and deployment groups.
  • Experienced with database archiving processes that are managed centrally for all data, whether archived on-premise or in the cloud.
  • Worked on Greenplum integrations.
  • Good experience in data analysis, error handling, error remediation and impact analysis.
  • Experience in Agile and Waterfall methodologies.
  • Versatile team player with excellent analytical, communication and presentation skills.
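
As a minimal illustration of the star-schema modeling noted in the dimensional modeling bullet above, the DDL sketch below shows a hypothetical member dimension and claim fact; all table and column names are illustrative and not taken from any specific project.

    -- Hypothetical star schema: one dimension plus one fact (Oracle-flavored SQL).
    CREATE TABLE dim_member (
        member_key      NUMBER        PRIMARY KEY,   -- surrogate key
        member_id       VARCHAR2(20)  NOT NULL,      -- natural/business key
        member_name     VARCHAR2(100),
        effective_date  DATE,
        expiry_date     DATE,
        current_flag    CHAR(1) DEFAULT 'Y'
    );

    CREATE TABLE fact_claim (
        claim_key       NUMBER        PRIMARY KEY,
        member_key      NUMBER        NOT NULL REFERENCES dim_member (member_key),
        date_key        NUMBER        NOT NULL,      -- points to a date dimension
        claim_amount    NUMBER(12,2),
        claim_count     NUMBER        DEFAULT 1
    );

Measures live only in the fact table while descriptive attributes sit in the dimensions; a snowflake variant would further normalize the dimension tables.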

TECHNICAL SKILLS

ETL/DWH Tools: Informatica Power Center 9.1/8.x/7.1, Teradata 12.0 (BTEQ, FastLoad, TPump), SSIS packages.

Databases: Oracle 10g/9i/8i, MS Access, MS SQL Server 2012/2008/2005, DB2, Teradata 14.0/12.0, Greenplum 1.16.1, Gemfire XD.

DBMS/Query Tools: TOAD, Rapid SQL, SQL Developer, WinSQL, SQL Assistant, SQL Navigator, PL/SQL Developer, pgadmin3, SQuirreL SQL Client 3.6.

Operating Systems: Microsoft Windows (Vista, XP, 2000, NT 4.0), OS/2, UNIX (Sun Solaris, HP-UX).

Programming Lang: SQL, PL/SQL, PostgreSQL, UNIX Shell Scripting, T-SQL, VB Script, Java, C, C++, C#.

Data Analysis: Data Design/Analysis, Business Analysis, User Requirement Gathering, User Requirement Analysis, Gap Analysis, Data Cleansing, Data Transformations, Data Relationships, Source Systems Analysis and Reporting Analysis.

PROFESSIONAL EXPERIENCE

Confidential

ETL Developer/Build Coordinator

Responsibilities:

  • Worked as a Developer on the ETL framework development of the MOMs EDW, which involves MS SQL Server, Gemfire XD, and Greenplum as the architecture stack.
  • Gained exposure to the custom C# package that generates SSIS packages based on the ETL metadata framework.
  • Worked with the Architect to establish the data quality methodology and conduct gap analysis and design reviews.
  • Configured the metadata framework for converting the existing process as-is to the new environment for migration.
  • Re-engineered poorly performing existing Greenplum functions into a metadata framework suitable for SSIS package migration.
  • Created the conversion tables in MS SQL for maintaining internal data integrity and publishing the master key in the system.
  • Ensured the successful development and unit testing of the SSIS package to read the data from FACETS, transform it, and load it into the staging table in SQL Server.
  • Configured Metadata Framework to load into staging tables and then to Gemfire Tables.
  • Wrote DDLs, queries, indexes, primary key constraints, and unit testing scripts in Gemfire XD using SQuirreL.
  • Worked on Greenplum functions for creating the aggregates for the dashboard reports.
  • Wrote Greenplum functions to load data from flat files to dimension and fact tables (a sketch follows this list).
  • Involved in unit testing, code reviews, and promotion to SIT.
  • Central point of contact for cross-stream touch points involved in the MOM's project implementation.
  • Received appreciation throughout the development phase.
  • Incorporated enterprise ETL development standards during the course of development lifecycle.
  • Participated in the project scrum/status meetings and provided daily status.
  • Liaised with production team to analyze bugs and resolve issues.
  • Established best practices standards and ensured adherence to them.
  • Created the mapping document for the Greenplum functions to SSIS metadata framework.
  • Documented the design, data flow diagrams, unit test case document, and metadata framework design document.
  • Documented the MCRR (cross-stream project) process for the data load.
  • Worked on generating 837 I and P XML by customizing the SSIS package and metadata tables, in line with healthcare rules and regulations (HIPAA).
  • Created mapping documents to outline data flow from sources to targets.
  • Involved in deployment activities: creating and updating the Metadata DDL/Stage DDL, Metadata DML, and Configuration DML, and committing the changes to SVN.
  • Prepared release notes and the deployment form, and coordinated to obtain deployment approval from the client's manager.
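
The Greenplum-functions bullet above refers to loads of this general shape; the PL/pgSQL sketch below assumes the flat file has already been staged (for example via gpfdist or COPY) and uses hypothetical object names (stg_claim, dim_member, fact_claim).

    -- Illustrative Greenplum load function: staging table into the claim fact.
    CREATE OR REPLACE FUNCTION etl.load_fact_claim()
    RETURNS integer AS
    $$
    DECLARE
        v_rows integer;
    BEGIN
        -- Resolve surrogate keys from the dimension and insert into the fact.
        INSERT INTO fact_claim (member_key, date_key, claim_amount, claim_count)
        SELECT d.member_key,
               to_char(s.claim_date, 'YYYYMMDD')::int,
               s.claim_amount,
               1
        FROM   stg_claim s
        JOIN   dim_member d ON d.member_id = s.member_id;

        GET DIAGNOSTICS v_rows = ROW_COUNT;
        RETURN v_rows;   -- row count is returned for audit logging
    END;
    $$ LANGUAGE plpgsql;

Returning the affected row count keeps the function usable from the metadata framework's audit steps.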

Environment: Visual Studio 2010, MS SQL Server 2012, pgAdmin 1.16.1 for Greenplum, Gemfire XD, SQuirreL SQL Client 3.6, ADO.NET, flat files, CSV files, PostgreSQL (functions).

Confidential, Dallas,TX

Sr. Informatica Developer

Responsibilities:

  • Designed, developed, and maintained the Socrates project end to end, and received the best performance award.
  • Central point of contact for multiple high-visibility projects with highly complex business functionality.
  • Worked on Informatica Power Center tools- Designer, Repository Manager, Workflow Manager, and Workflow Monitor.
  • Translated high-level design specifications into simple ETL coding and mapping standards.
  • Designed and customized data models for the Data Warehouse, supporting data from multiple sources in real time.
  • Involved in building the ETL architecture and Source to Target mapping to load data into Data warehouse.
  • Created mapping documents to outline data flow from sources to targets.
  • Involved in Dimensional modeling (Star Schema) of the Data warehouse and used Erwin to design the business process, dimensions and measured facts.
  • Maintained source definitions, transformation rules, and target definitions using Informatica Repository Manager.
  • Worked with Data Quality architect and administrator to establish a data quality methodology, documenting a repeatable set of processes for determining, investigating, and resolving data quality issues, and established an on-going process for maintaining quality data.
  • Designed and developed advanced reusable UNIX shell scripts for ETL auditing, error handling and automation.
  • Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, Stored Procedure, and Union to develop robust mappings in the Informatica Designer.
  • Developed mapping parameters and variables to support SQL override.
  • Created mapplets & reusable transformations to use them in different mappings.
  • Developed mappings to load into staging tables and then to Dimensions and Facts.
  • Used existing ETL standards to develop these mappings.
  • Worked on different tasks in Workflows such as Session, Event Raise, Event Wait, Decision, E-mail, Command, Worklet, Assignment, and Timer, and on scheduling of the workflow.
  • Implemented Slowly Changing Dimension Type 1 and Type 2 for change data capture using version control (the set-based equivalent is sketched after this list).
  • Developed advanced reusable mapplets for SCD and fact loading.
  • Involved in Unit testing, code reviewing, moving in UAT and PROD.
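
The SCD Type 2 bullet above was implemented with Informatica Lookup and Update Strategy transformations; the set-based SQL below is only a sketch of the equivalent logic, with hypothetical table and column names.

    -- Step 1: expire the current dimension rows whose tracked attributes changed.
    UPDATE dim_member d
    SET    d.expiry_date  = TRUNC(SYSDATE) - 1,
           d.current_flag = 'N'
    WHERE  d.current_flag = 'Y'
    AND    EXISTS (SELECT 1
                   FROM   stg_member s
                   WHERE  s.member_id = d.member_id
                   AND    (s.member_name <> d.member_name OR s.plan_code <> d.plan_code));

    -- Step 2: insert a new current version for changed and brand-new members.
    INSERT INTO dim_member (member_key, member_id, member_name, plan_code,
                            effective_date, expiry_date, current_flag)
    SELECT dim_member_seq.NEXTVAL, s.member_id, s.member_name, s.plan_code,
           TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
    FROM   stg_member s
    WHERE  NOT EXISTS (SELECT 1
                       FROM   dim_member d
                       WHERE  d.member_id = s.member_id
                       AND    d.current_flag = 'Y');

Type 1 attributes would simply be overwritten in place instead of versioned.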

Environment: Informatica Power Center 8.6.1, Oracle 10g, Flat Files, CSV files, PL/SQL (stored procedures, triggers, packages), UNIX, Toad.

Confidential

Sr.Informatica Developer

Responsibilities:

  • Worked with various Databases for extracting the files and loading them into different databases.
  • Designed the ETLs and conducted review meetings.
  • Played a lead role in managing the offshore team.
  • Worked mainly on troubleshooting the errors that occurred during the loading process.
  • Created stored procedures, triggers, tables, indexes, and rules as needed to support extraction, transformation and load (ETL) processes.
  • Designed and developed integrations to export the customer data from the EDW to Greenplum DW.
  • Designed and developed integrations to import back the customer scores from flat files generated by Greenplum.
  • Created a replication process to keep re-engineered database in sync with legacy database.
  • Extensively worked on creating mapping parameters and variables which are used in various workflows for reusability.
  • Worked with various active transformations in Informatica Power Center like Filter Transformation, Aggregator Transformation, Joiner Transformation, Rank Transformation, Router Transformation, Sorter Transformation, Source Qualifier, and Update Strategy Transformation.
  • Extensively worked with various Passive transformations in Informatica Power Center like Expression Transformation, and Sequence Generator.
  • Extensively worked with Slowly Changing Dimensions Type1, Type2, for Data Loads.
  • Responsible for Performance Tuning at the Mapping Level, Session Level, Source Level, and the Target Level.
  • Involved in writing Stored Procedures in SQL and extensively used this transformation in writing many scenarios as per the requirement.
  • Worked with re-usable objects like Re-Usable Transformation and Mapplets.
  • Extensively worked with aggregate functions like Avg, Min, Max, First, Last in the Aggregator Transformation.
  • Extensively used SQL Override function in Source Qualifier Transformation.
  • Extensively used Normal Join, Full Outer Join, Detail Outer Join, and Master Outer Join in the Joiner Transformation.
  • Extensively worked with incremental loading using parameter files, mapping variables, and mapping parameters (see the override sketch after this list).
  • Wrote queries and procedures, created indexes and primary keys, and performed database testing.
  • Defects were analyzed, fixed, tested, tracked and reviewed.
  • Implemented various Performance Tuning techniques on Sources, Targets, Mappings, and Workflows.
  • Used Source Analyzer and Warehouse Designer to import the source and target database schemas, and the Mapping Designer to map the sources to the target.
  • Involved in performance tuning and monitoring (both SQL and Informatica) considering the mapping and session performance issues.
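
As referenced in the incremental-loading bullet above, a minimal sketch of a Source Qualifier SQL override is shown below; the $$LAST_EXTRACT_DATE mapping variable would be supplied through a parameter file, and the table and column names are purely illustrative.

    -- Incremental extract: only rows changed since the previous successful run.
    SELECT o.order_id,
           o.customer_id,
           o.order_amount,
           o.last_update_ts
    FROM   orders o
    WHERE  o.last_update_ts > TO_DATE('$$LAST_EXTRACT_DATE', 'YYYY-MM-DD HH24:MI:SS')

After a successful session, a SetMaxVariable expression (or the parameter-file generation step) advances the variable so the next run picks up where this one left off.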

Environment: Informatica Power Center 9.1, Oracle 10g, Web services, MS Access 2010, SQL*Loader, UNIX, WinSCP, PuTTY, SQL Developer, PL/SQL.

Confidential

ETL Informatica Developer

Responsibilities:

  • Worked with Business analysts to understand business/system requirements in order to transform business requirements into effective technology solutions by creating Technical Specifications (Source to Target Documents) for the ETL from the Functional Specifications.
  • Worked on Informatica tools like Designer, Workflow Manager, and Workflow Monitor.
  • Worked mostly on Lookup, Aggregator, and Expression transformations to implement complex logic while coding a mapping.
  • Worked with Memory cache for static and dynamic cache for the better throughput of sessions containing Rank, Lookup, Joiner, Sorter and Aggregator transformations.
  • Monitoring the Jobs, observing the performance of the individual transformations and tuning the code (Performance Tuning).
  • Migrating objects between different Environments using XML Export/Import (using Repository Manager).
  • Developed code to load the data from Flat File to stage and stage to ODS.
  • Developed mappings with XML as target and formatting the target data according to the requirement.
  • Designed and developed Informatica mappings for data loads and data cleansing.
  • Created mappings to incorporate incremental loads.
  • Developed reusable Mapplets to include the Audit rules.
  • Working with Power Center Versioning (Check-in, Check-out), Querying to retrieve specific objects, maintaining the history of objects.
  • Designed the Workflow for the ODS load following the load Dependencies.
  • Involved in fixing Invalid Mappings, testing of Stored Procedures and Functions, Unit and Integration Testing of Informatica Sessions, Batches and Target Data.
  • Solely responsible for the daily loads and handling the reject data.
  • Generated SQL queries to check the consistency of the data in the tables and to update the tables as per the requirements (sample checks follow this list).
  • Created stored procedures and sequences to insert keys into the database tables.
  • Involved in writing Perl scripts for file transfers and file renaming, and a few other database scripts to be executed from UNIX.
  • Extensively Used Perl Scripting to schedule the Informatica Jobs.
  • Maintained/Fixed bugs in existing PERL scripts.
  • Guided the testing team and the development team and monitored the Implementation.
  • Provided support for the Nightly jobs.
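
A couple of illustrative post-load consistency checks of the kind mentioned above are sketched here; table names are hypothetical.

    -- Row-count reconciliation between the staging table and the ODS target.
    SELECT (SELECT COUNT(*) FROM stg_policy) AS stage_rows,
           (SELECT COUNT(*) FROM ods_policy) AS ods_rows
    FROM   dual;

    -- Orphaned foreign keys: ODS rows whose policy holder is missing.
    SELECT o.policy_id
    FROM   ods_policy o
    WHERE  NOT EXISTS (SELECT 1
                       FROM   ods_policy_holder h
                       WHERE  h.holder_id = o.holder_id);

Any mismatch or orphan rows would be routed to the reject-handling process described above.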

Environment: Informatica Power Center 8.6.1, Oracle 10g, PL/SQL Developer, XML, Windows XP, Rational ClearQuest, Perl, TOAD.

Confidential

Informatica Developer

Responsibilities:

  • Supported over 350 mappings for enhancements and issues.
  • Understood complex business requirements and converted them into technical requirements.
  • Involved in all phases of SDLC - requirement gathering, design, development, testing and implementation and post-production support.
  • Worked with almost all transformations, including Stored Procedure, Connected and Unconnected Lookup, Update Strategy, Filter, Router, Expression, Aggregator, and Joiner.
  • Wrote simple and complex SQL overrides in Source Qualifier and Lookup transformations according to the business requirements (see the override sketch after this list).
  • Developed Informatica mappings, tuned them for better performance, and validated the data.
  • Worked on Version Control in Informatica to maintain multiple versions of an object, control development on the object and track changes.
  • Collaborated with business, technical architect groups, development teams, unit testing and QA teams to ensure that the business requirements are implemented correctly.
  • Responsible for Informatica code deployment/migration from development to pre- production.
  • Analyzed Mapping, session and system bottlenecks to improve performance of various ETL jobs.
  • Provided post-production support and prepared support documents.
  • Authored and maintained all technical documentation.
  • Participated in the project status meetings and provided weekly status.
  • Liaised with production team to analyze bugs and resolve issues.
  • Established best practices standards and ensured adherence to them.
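
For the SQL-override bullet above, a Lookup transformation override of this general shape restricts the cache to current rows; names are illustrative, and the selected columns must match the lookup ports.

    -- Lookup override: cache only the active version of each customer.
    SELECT cust.customer_key  AS customer_key,
           cust.customer_id   AS customer_id,
           cust.customer_name AS customer_name
    FROM   dim_customer cust
    WHERE  cust.current_flag = 'Y'

Trimming the cached rows this way keeps the lookup cache small, which is often the cheapest performance-tuning win.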

Environment: Informatica Power Center 8.1.0, Toad, SQL Server 2005, IBM DB2, Teradata 12.0, Erwin, Oracle 10g, SQL, ScrumWorks Portal, Windows XP Professional.

Confidential

Informatica Developer

Responsibilities:

  • Involved in interacting with Business Analysts in analyzing the requirements.
  • Involved in documenting the Technical and Functional Documents like High Level Design, Low Level Design and System Specification Document.
  • Developed ETL mappings, Transformations and Loading using Informatica Power Center.
  • Extensively Performed the data validations and profiling using IDQ.
  • Extensively worked on Informatica Designer components Source Analyzer, Target Designer, Transformation Developer, Mapping Designer and Mapplet Designer.
  • Developed various mappings to load data from various sources using different Transformations including Router, Aggregator, Joiner, Lookup, Update Strategy, Stored Procedure, Sorter, Filter, Source Qualifier, Expression, Union and Sequence Generator to store the data in target tables.
  • Developed mapping based on the mapping specification document that indicates the source tables, columns, data types, transformation required, business rules, target tables, target columns and data types.
  • Involved in Unit Testing, Integration Testing and End-End Testing.
  • Captured the Unit Test Results and documented the test results.
  • Used Workflow Manager to create workflows and sessions, and used various tasks such as Command and E-mail.
  • Created Workflows, Worklets and Tasks to schedule the loads at required frequency using Workflow Manager.
  • Modified existing Unix Shell Scripts as per the requirements.
  • Wrote complex SQL queries to apply the specified conditions when extracting data from the Teradata (TD) Data Mart (a sample follows this list).
  • Used the Teradata utilities BTEQ, FastExport (FEXP), FastLoad (FLOAD), and MultiLoad (MLOAD) to export and load data to/from flat files.
  • Designed and implemented Oracle 10g database objects to support the interface (tables, views, materialized views).
  • Defined Target Load Order Plan for loading data into different Target Tables.
  • Used debugger extensively to identify the bottlenecks in the mappings.
  • Extensively worked with various Lookup caches like Static cache, Dynamic cache and Persistent cache.
  • Worked on performance tuning and optimization of the Sessions, Mappings, Sources and Targets.
  • Used Informatica Power Center for mapping and workflow design including the use of mapplets, reusable transformations and worklets.
  • Migrated Objects, mappings, workflows from Dev environment to QA environment.
  • Replicated operational tables into staging tables, to transform and load data into one single database using Informatica.
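
An extract of the kind described in the TD Data Mart bullet above might look like the Teradata-style query below, which keeps only the latest row per member within the load window; all names are hypothetical.

    -- Latest claim row per member within the extract window (Teradata QUALIFY).
    SELECT c.member_id,
           c.claim_id,
           c.claim_status,
           c.paid_amount,
           c.load_dt
    FROM   claim_mart.claim_detail c
    WHERE  c.load_dt BETWEEN DATE '2013-01-01' AND DATE '2013-12-31'
    QUALIFY ROW_NUMBER() OVER (PARTITION BY c.member_id
                               ORDER BY c.load_dt DESC) = 1;

The same result set could be exported with BTEQ or FastExport before being handed to the downstream Informatica session.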

Environment: Informatica PowerCenter 7.1.4 (Repository Manager, PowerCenter Designer, Workflow Manager, Workflow Monitor), Oracle 10g, Teradata 12.0, UNIX, Windows 7/XP, JIRA, Autosys, SQL Developer 3.0, Flat Files, SQL, PL/SQL, Informatica PowerExchange, Microsoft tools, PuTTY, WinSCP.

Confidential

ETL Developer

Responsibilities:

  • Worked as an Informatica Developer.
  • Extensively worked in data Extraction, Transformation and Loading from Source to target.
  • Involved in analysis, design & testing environment.
  • Used Source Analyzer and Warehouse Designer to import the source and target database schemas, and the mapping designer to map source to the target.
  • Used Transformation Developer to create the Joiner, Filter, Router, Lookup, Expression, Aggregator, Update Strategy, and Stored Procedure transformations used in mappings.
  • Created and executed sessions using Workflow Manager.
  • Developed reusable mapplets using mapplet designer.
  • Understood the existing business model and customer requirements.
  • Involved in preparation and execution of the unit, integration and end to end test cases.
  • Created multiple universes and resolved loops by creating table aliases and contexts.
  • Used session partitions, Dynamic cache memory and Index caches for improving performance of Informatica server.
  • Extracted data from SQL server Source Systems and loaded into Oracle Target tables.
  • Involved in writing shell scripts for automating pre-session, post-session processes and batch execution at required frequency using power center server manager.
  • Involved in the loading and Scheduling of jobs to be run in the Batch process.
  • Optimized and performed Tuning in mappings to achieve higher response times.
  • Involved in the migration of existing ETL process to Informatica Power center.
  • Created effective Test data and developed thorough Unit test cases to ensure successful execution of the data loading processes.
  • Created reports using business object functionality like queries, slice and dice, drill down, functions and formulas.

Environment: Informatica Power Center 7.1, Windows 2000, Solaris (SunOS 5.8), Oracle 9, Toad 7.6, PuTTY, FileZilla, SQL*Plus.
