
Snr. ETL Informatica Developer / Analyst Resume


Boston, MA

SUMMARY:

  • 8 years of IT experience in data warehousing, with emphasis on business requirements, application design and development, testing, implementation and maintenance of data warehouses.
  • Involved in all phases of SDLC from analysis and planning to development and deployment.
  • Experience in OLTP modeling and OLAP dimensional modeling (star and snowflake schemas) using ERWIN (conceptual, logical and physical data models).
  • Strong experience developing complex mappings using transformations like Source Qualifier, Filter, Expression, Joiner, Router, Union, Unconnected / Connected Lookups and Aggregator.
  • Experience in Slowly Changing Dimensions (Type 1, Type 2 and Type 3), OLTP, OLAP and surrogate keys (see the SCD Type 2 sketch after this list).
  • Designed and Developed ETL logic for implementing CDC (Change Data Capture).
  • Extensive experience with all Workflow Manager tasks to implement job control in Informatica and support dependencies when loading data into target systems, including pre- and post-session commands/SQL.
  • Strong skills in SQL and PL/SQL packages, functions, stored procedures, triggers and materialized views to implement business logic in the Oracle database.
  • Experience with healthcare data in HIPAA formats, including NDC, DRG, CPT, NCPDP, NSF codes, ICD-10, ICD-9, and the 837, 834 and 835 transactions.
  • Working experience with Oracle, Teradata, SQL Server, MySQL, Sybase and Netezza databases, and with interfaces like SQL*Loader, TOAD, Teradata SQL Assistant/console and ERWIN.
  • Expertise in Performance Tuning of sources, targets, transformations and sessions.
  • Experience in quality assurance testing: logging defects, verifying fixes in multiple environments, and communicating fix successes, failures and status updates as appropriate using bug-tracking tools like Test Director and Quality Center.
  • Experience working in UNIX environments, writing UNIX shell scripts for Informatica pre- and post-session operations.
  • Excellent interpersonal and communication skills; technically competent and results-oriented, with strong problem-solving skills and the ability to work effectively both as a team member and independently.
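
A minimal SCD Type 2 sketch in Oracle SQL, for illustration only: the DIM_CUSTOMER and STG_CUSTOMER tables and their columns are hypothetical, and in PowerCenter this logic would normally live in a mapping built from Lookup, Expression and Update Strategy transformations rather than hand-written SQL.

    -- Step 1: expire the current dimension row when a tracked attribute changes (hypothetical tables/columns).
    UPDATE dim_customer d
       SET d.current_flag = 'N',
           d.effective_end_dt = TRUNC(SYSDATE) - 1
     WHERE d.current_flag = 'Y'
       AND EXISTS (SELECT 1
                     FROM stg_customer s
                    WHERE s.customer_id = d.customer_id
                      AND (s.address <> d.address OR s.status <> d.status));

    -- Step 2: insert a new current version for changed and brand-new customers.
    INSERT INTO dim_customer
           (customer_sk, customer_id, address, status,
            effective_start_dt, effective_end_dt, current_flag)
    SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.address, s.status,
           TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
      FROM stg_customer s
     WHERE NOT EXISTS (SELECT 1
                         FROM dim_customer d
                        WHERE d.customer_id = s.customer_id
                          AND d.current_flag = 'Y'
                          AND d.address = s.address
                          AND d.status = s.status);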

TECHNICAL SKILLS:

ENVIRONMENTS: Windows, UNIX, Linux, Mainframes

ETL Tools: Informatica, SQL Server SSIS

DATABASES: Oracle, SQL Server, Teradata, DB2, VSAM on Mainframes

BI TOOLS: COGNOS, Tableau, Business Objects, SSRS

PROFESSIONAL EXPERIENCE:

Confidential, Boston, MA

Snr. ETL Informatica Developer / Analyst

Responsibilities:

  • Worked with the product analyst and business users to clarify requirements and translate them into technical specifications. Involved in business analysis and technical design sessions with business and technical staff to develop the requirements document and ETL specifications.
  • Built Informatica servers and set up the Development, QA and PROD environments.
  • Pulled data from Mainframes and loaded into SQL Server.
  • Created Informatica Instance on Microsoft Azure Server.
  • Performed extraction, transformation and load using Informatica PowerCenter to build the data warehouse, working with PowerCenter tools like Source Analyzer, Warehouse Designer and Mapping Designer.
  • Created data maps for binary (VSAM) files, mapping one binary file to a single COBOL copybook as well as one binary file to multiple COBOL copybooks.
  • Developed standard and reusable mappings and mapplets using various transformations like Expression, Aggregator, Joiner, Source Qualifier, Lookup and Router; also developed mappings using parameters and variables.
  • Used reverse engineering in Erwin to understand the existing data model of the data warehouse.
  • Involved in Relational and Dimensional Data Modeling Techniques to design ERWIN data models.
  • Worked extensively in Informatica Designer to design a robust end-to-end ETL process involving complex transformations like Source Qualifier, Lookup, Update Strategy, Router, Aggregator, Sequence Generator, Filter, Expression, Stored Procedure, External Procedure and Transaction Control for efficient extraction, transformation and loading of data to staging and then to the data mart (data warehouse), verifying the complex logic for computing the facts.
  • Worked extensively with HIPAA transactions such as 834, 835, 277 and 276 as the source data.
  • Involved in analyzing ICD codes for data mapping at the source and target level.
  • Involved in analyzing different modules of the Facets system and EDI interfaces to understand the source system and source data.
  • Extensively used reusable transformations, mappings and mapplets for faster development and standardization.
  • Used reusable sessions across different levels of workflows.
  • Created mapping documents based on the requirements.
  • Created transformation rules from source to target as per the requirement.
  • Created and scheduled Sessions and Batches through the Informatica Server Manager. Designed and documented validation rules, error handling and test strategy of ETL process.
  • Tuned Informatica mappings/sessions for better ETL performance by eliminating bottlenecks. Used Informatica to load data from flat files, both fixed-length and delimited, and from SQL Server into the data mart on an Oracle database.
  • Extracted data from multiple data sources, such as DB2 and binary VSAM files on mainframes, and loaded it into staging tables.
  • Worked on complex Source Qualifier queries and pre- and post-SQL queries on the target (see the sketch after this list).
  • Created mappings, complex workflows with multiple sessions, worklets with consecutive/concurrent sessions for loading fact and dimension tables into data mart presentation layer.
  • Conducted performance tuning of Informatica components for daily and monthly incremental loading tables.
  • Used Workflow Manager for creating, validating, testing and running sequential and concurrent batches.
  • Enabled identity insert in the Informatica connection for the required tables so they insert their own sequence numbers.
  • Implemented source- and target-based partitioning for existing workflows in production to improve performance and cut down running time.
  • Analyzed workflow, session, event and error logs for troubleshooting the Informatica ETL process.
  • Worked with Informatica Debugger to debug the mappings in Informatica Designer.
  • Involved in creating test plans, test cases to unit test Informatica mappings, sessions and workflows.
  • Migrated Informatica ETL application and Database objects through various environments such as Development, Testing and Production environments.
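
A minimal sketch of the kind of pre- and post-session SQL referenced above, for illustration only; the STG_CLAIMS table, its index and the EDW_STG schema are hypothetical. Pre-SQL typically clears a staging table before the load, while post-SQL rebuilds indexes and refreshes optimizer statistics afterward.

    -- Pre-session SQL on the target connection: empty the staging table before the load.
    TRUNCATE TABLE stg_claims;

    -- Post-session SQL: rebuild the staging index and refresh optimizer statistics.
    ALTER INDEX stg_claims_ix1 REBUILD;

    BEGIN
      DBMS_STATS.GATHER_TABLE_STATS(ownname => 'EDW_STG', tabname => 'STG_CLAIMS');
    END;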

Environment: Informatica Power Center, Teradata, VSAM, DB2, Mainframes, SQL Server Management Studio, Oracle, Windows, Microsoft Azure.

Confidential, Westfield, OH

Snr. BI / ETL Developer

Responsibilities:

  • Designed ETL functional specifications and converted them into technical specifications.
  • Participated in user meetings, gathered requirements and discussed data issues with end users; translated user input into ETL design documents.
  • Prepared high-level design documents.
  • Prepared source-to-target mapping documents and developed the mappings.
  • Developed ETL mappings and fixed bugs.
  • Modified mappings and workflows in Informatica PowerCenter as per the change requests raised by the client.
  • Used various transformations like filter, expression, sequence generator, update strategy, joiner, router, and aggregator.
  • Configured and ran sessions, tasks, and workflows.
  • Migrated Informatica workflows/folders from one environment to another.
  • Responsible for maintaining repository backups and their restorations.
  • Defined relationships and cardinality between different database tables (see the sketch after this list).
  • Worked with DBA to setup the new Databases and Modify Databases for Reporting Models.
  • Created bar charts in Tableau from data sets and added trend lines and forecasts based on conditions predefined by the business.
  • Created crosstabs to display the underlying data behind various graphs and charts for further data analysis.
  • Involved in creating reports based on Budgeting, Forecasting and planning.
  • Deployed packages from a development environment to test environment.
  • Involved in developing Proof of Concept for Big data technology implementation.
  • Performed data analysis on the daily, weekly and monthly scheduled data refreshes, accounting for business and system changes, to ensure that the published dashboards displayed accurate and up-to-date information.
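
For illustration, a minimal Oracle DDL sketch of defining a relationship and its cardinality at the database level; the DIM_PRODUCT and FACT_SALES tables and their columns are hypothetical. The foreign key combined with the NOT NULL column expresses a mandatory many-to-one relationship from the fact table to the dimension.

    CREATE TABLE dim_product (
        product_sk    NUMBER        PRIMARY KEY,
        product_code  VARCHAR2(30)  NOT NULL,
        product_name  VARCHAR2(100)
    );

    CREATE TABLE fact_sales (
        sales_id      NUMBER        PRIMARY KEY,
        product_sk    NUMBER        NOT NULL,   -- mandatory: every sale references exactly one product
        sale_amount   NUMBER(12,2),
        sale_date     DATE,
        CONSTRAINT fk_sales_product
            FOREIGN KEY (product_sk) REFERENCES dim_product (product_sk)
    );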

Environment: Informatica Power Center, Tableau, Hadoop, Cloudera, Oracle, Sybase, Delimited files, UNIX Shell Script, Windows.

Confidential, Pittsburgh, PA

Informatica Developer

Responsibilities:

  • Collaborated with Project Manager, Tech Lead, Developers, QA teams and Business SMEs to ensure delivered solutions optimally support the achievement of business outcomes.
  • Designed and developed ETL Processes based on business rules, job control mechanism using Informatica Power Center.
  • Performed data warehouse data modeling based on client requirements using Erwin (conceptual, logical and physical data modeling).
  • Worked extensively on complex mappings using source qualifier, joiner, expressions, aggregators, filters, Lookup, update strategy and stored procedure transformations.
  • Used workflow monitor to monitor the jobs, reviewed session/workflow logs that were generated for each session to resolve issues, used Informatica debugger to identify issues in mapping execution.
  • Re-engineered numerous existing mappings to support new and changing business requirements.
  • Used mapping variables, parameters, workflow variables and parameter files to support change data capture and automate the workflow execution process.
  • Migrated data from Netezza to Oracle.
  • Wrote UNIX shell scripts for pre/post-session commands, and shell scripts to kick off workflows, unschedule workflows and get workflow status.
  • Performed Informatica administration tasks such as managing users and privileges, migrations, starting and stopping pmrep/pmserver, and backing up and restoring the repository service.
  • Created partitions, SQL overrides in Source Qualifiers and session partitions to improve performance.
  • Tuned SQL statements and Informatica mappings, and used Informatica parallelism options to speed up data loading to meet defined SLAs.
  • Supported Informatica and non-Informatica code migration between environments (DEV/QA/PRD).
  • Developed Oracle PL/SQL packages, procedures, functions and database triggers (see the sketch after this list).
  • Performed unit testing, system integration testing, and supported user acceptance testing.
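
A minimal sketch of the type of PL/SQL package described above, for illustration only; the ETL_AUDIT package and the ETL_LOAD_AUDIT table are hypothetical. It records the row count and timestamp of each batch load for an ETL audit trail.

    -- Hypothetical audit package: records row counts per ETL batch run.
    CREATE OR REPLACE PACKAGE etl_audit AS
      PROCEDURE log_load (p_batch_id IN NUMBER,
                          p_table    IN VARCHAR2,
                          p_rows     IN NUMBER);
    END etl_audit;
    /

    CREATE OR REPLACE PACKAGE BODY etl_audit AS
      PROCEDURE log_load (p_batch_id IN NUMBER,
                          p_table    IN VARCHAR2,
                          p_rows     IN NUMBER) IS
      BEGIN
        INSERT INTO etl_load_audit (batch_id, table_name, rows_loaded, load_ts)
        VALUES (p_batch_id, p_table, p_rows, SYSTIMESTAMP);
        COMMIT;
      END log_load;
    END etl_audit;
    /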

Environment: Informatica, Oracle, IBM DB2, Netezza, MS SQL Server, Flat files, SQL*Loader, Unix scripting, PL/SQL, Data Modeling, Data Analysis, Dimensional Modeling.

Confidential, Chicago, IL

Informatica Developer

Responsibilities:

  • Worked closely with business analysts and gathered functional requirements. Designed technical design documents for ETL process.
  • Developed ETL mappings and transformations using Informatica PowerCenter 9.0.1/8.6.1.
  • Implemented a Change Data Capture (CDC) process to load data into the staging area (see the sketch after this list).
  • Extensively used Informatica client tools Source Analyzer, Warehouse designer, Mapping Designer, Mapplet Designer, Transformation Developer, and Workflow Manager.
  • Extensively used ETL Tool Informatica to load data from Flat Files, Oracle, Sybase, and Excel.
  • Developed reusable Mapplets, Transformations and user defined functions.
  • Extensively used Mapping Debugger to handle the data errors in the mapping designer.
  • Experience using transformations such as Normalizer, Unconnected/Connected Lookups, Router, Aggregator, Joiner, Update Strategy, Union, Sorter, and reusable transformations.
  • Created Event-Wait, Event-Raise, Email and Command tasks in the Workflow Manager.
  • Responsible for tuning ETL procedures to optimize load and query Performance.
  • Good data modeling experience using dimensional data modeling, star schema modeling, snowflake modeling, and fact and dimension tables.
  • Tested the mapplets and mappings as per Quality and Analysis standards before moving to production environment.
  • Involved in writing shell scripts for file transfers, file renaming and file concatenation.
  • Created debugging sessions for error identification by creating break points and monitoring the debug data values in the mapping designer.
  • Developed Unit test cases and Unit test plans to verify the data loading process.
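
A minimal sketch of a timestamp-based CDC extract of the kind mentioned above, for illustration only; the SRC_MEMBER table, its columns and the $$LAST_RUN_TS parameter are hypothetical. In PowerCenter, such a filter would typically sit in a Source Qualifier SQL override, with the last-run timestamp supplied by a mapping variable or parameter file.

    -- Pull only the rows changed since the previous successful run.
    SELECT m.member_id,
           m.plan_code,
           m.status,
           m.last_update_ts
      FROM src_member m
     WHERE m.last_update_ts > TO_TIMESTAMP('$$LAST_RUN_TS', 'YYYY-MM-DD HH24:MI:SS')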

Environment: Informatica PowerCenter, Oracle, Sybase, Delimited files, UNIX Shell Script, Windows XP, Toad for Oracle, SQL.
