
Sr ETL Informatica Developer Resume


FL

SUMMARY

  • 7+ years of IT experience in analysis, design, development, implementation, and troubleshooting of Data Warehouse applications.
  • Demonstrated expertise in utilizing the ETL tools Informatica PowerCenter 10.2/9.x/8.x and Informatica PowerExchange 10.2/9.x/8.x, and in administering Data Warehouse loads as per client requirements.
  • Expertise in building Enterprise Data Warehouses (EDW), Operational Data Store (ODS), Data Marts, Data Lake and Decision Support Systems (DSS) using Multidimensional and Dimensional modeling (Star and Snowflake schema) Concepts.
  • Created clear and concise epics/stories using JIRA as a tracking tool.
  • Strong understanding of Data warehouse project development life cycle. Expertise in documenting all the phases of DWH projects.
  • Good experience in designing and developing audit, error-identification, and reconciliation processes to ensure the data quality of the Data Warehouse.
  • Involved in all phases of the data warehouse project life cycle. Designed and developed ETL architecture to load data from various sources like DB2 UDB, Oracle, flat files, XML files, Sybase and MS SQL Server into Oracle, Teradata, XML, SQL Server, Hive, Impala and Azure targets.
  • Experience with data lake implementations; developed Informatica processes to load data into Hive and Impala systems.
  • 3+ years of experience using the IDQ tool 10.0/9.6 for profiling, applying rules, and developing mappings to move data from source to target systems.
  • Experience with Informatica deployments using deployment groups and third-party deployment tools.
  • Good understanding on Hive SQL.
  • Built a proof-of-concept stand-alone scheduler using AngularJS. Assisted in the implementation of JIRA, logging epics and stories and assigning tasks.
  • Extensive knowledge of developing Teradata FastExport, FastLoad, MultiLoad, and BTEQ scripts. Coded complex scripts and fine-tuned queries to enhance performance (see the BTEQ sketch after this list).
  • Profound knowledge about the architecture of the Teradata database.
  • Experience in writing PL/SQL and T-SQL procedures for processing business logic in the database, and tuning SQL queries for better performance.
  • Experience in working with big data Hadoop stack tools like HDFS, Hive, Pig, and Sqoop.
  • Strong experience in implementing CDC using Informatica PowerExchange 10.2/9.x/8.x, creating registration groups and extraction groups for CDC implementation.
  • Significant Multi-dimensional and Relational data modeling experience, Data Flow Diagrams, Process Models, ER diagrams with modeling tools like ERWIN & VISIO.
  • Extensive experience in developing mappings using various transformations like Source Qualifier, Expression, Lookup, Aggregator, Router, Rank, Filter and Sequence Generator transformations and various reusable components like Mapplets and Reusable transformations.
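For illustration, a minimal sketch of the kind of BTEQ load script referenced above, assuming hypothetical object and logon names (stg_orders, dw_orders, tdprod); the actual scripts were project-specific.

```sql
-- BTEQ batch script: incremental insert from staging into the warehouse,
-- with an error check so the scheduler can detect a failed load.
.LOGON tdprod/etl_user,etl_password;

INSERT INTO dw_orders (order_id, order_dt, amount)
SELECT order_id, order_dt, amount
FROM   stg_orders
WHERE  order_dt > (SELECT MAX(order_dt) FROM dw_orders);

-- Exit with a non-zero return code on failure so Autosys/Control-M flags the job.
.IF ERRORCODE <> 0 THEN .QUIT 8;

.LOGOFF;
.QUIT 0;
```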

TECHNICAL SKILLS

ETL Tools: Informatica PowerCenter 10.2/9.x/8.x/7.x, Informatica PowerExchange 10.2/9.x/8.x/7.x, SQL Server 2014/2012, IICS R29

Database: Oracle 12c/11g/10g, MS SQL Server 2016/2014/2012, Teradata 15/14, DB2-UDB, Informix, Netezza, Azure

BI/Reporting Tools: Tableau, Business Objects XI

Big Data: Hadoop Ecosystem (HDFS, Hive, Pig, Sqoop)

Data Modeling Tools: ER/Studio, ERWIN, Power Designer

Languages: SQL, PL/SQL, T-SQL, UNIX Shell Scripts, VB Script

Operating Systems: UNIX/LINUX, AIX, Windows, MS Windows Server, MS DOS

Other Applications: Toad, SQL Developer, Visual Studio, MS Visio, Control M, Autosys, AWS, JIRA, QC

Protocol: ODBC, JDBC, OLE DB, TCP/IP

Server: MS SQL Server, LINUX Server

PROFESSIONAL EXPERIENCE

Confidential, FL

Sr ETL Informatica Developer

Responsibilities:

  • Requirement gathering, business analysis, and documentation of functional, technical, and integration documents, including low-level and high-level design documents.
  • Involved in Requirement analysis in support of Data Warehousing efforts and Data Lake Implementation.
  • Extensively worked with various Active transformations like Filter, Sorter, Aggregator, Router, SQL, Union and Joiner transformations.
  • Worked on Migration from PowerCenter 9.6.1 to PowerCenter 10.2 version.
  • Worked on moving data to the data lake using Informatica BDE and big data concepts.
  • Worked with source databases like Oracle, SQL Server, Teradata and Flat Files.
  • Database experience using Oracle, SQL Server, Azure SQL Server, SAP HANA, Teradata, DB2 and MS Access.
  • Extensively worked with the Teradata utilities BTEQ, FastLoad, MultiLoad, and TPT to load data into the warehouse.
  • Responsible for investigation, characterization, and communication of build and release problems, implementing corrective and preventive actions. Resolved all the issues with JIRA tickets on priority basis.
  • Analyzed change requests (CRs) submitted through TeamTrack/JIRA. Created service requests (SRs) with Informatica for any PowerCenter product issues.
  • Worked independently on data migration tasks: database backup and restore, data comparison between databases, schema comparison, and execution of migration scripts.
  • To improve performance, balanced the input files against the cluster's slice count for large files, loaded them into the AWS S3 refine bucket, and used the COPY command to achieve micro-batch loads into Amazon Redshift (see the COPY sketch after this list).
  • Created complex mappings using Unconnected and Connected Lookup Transformations using different caches.
  • Involved in migration projects to move data from data warehouses on Oracle/DB2 to Teradata.
  • Migrated Informatica jobs to Hadoop Sqoop jobs and loaded the data into the Oracle database.
  • Implemented Slowly Changing Dimension Type 1 and Type 2 for change data capture (a Type 2 sketch follows this list).
  • Worked on loading data into Hive and Impala system for data lake implementation.
  • Per business requirements, implemented auditing and balancing on the transactional sources so that every record read is either captured in the maintenance tables or written to the target tables.
  • Experience on Star Schema Modeling, Snow-Flake Modeling, FACT and Dimensions Tables, Physical and Logical Data Modeling using Erwin.
  • Implemented Informatica pushdown optimization to utilize database resources for better performance.
  • Extensively used the tasks like email task to deliver the generated reports to the mailboxes and command tasks to write post session and pre-session commands.
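A hedged sketch of the S3-to-Redshift micro-batch load described above; the bucket, table, and IAM role names are hypothetical placeholders.

```sql
-- Micro-batch COPY from the S3 refine bucket into Redshift. Splitting the
-- input into a multiple of the cluster's slice count lets COPY load the
-- files in parallel across slices.
COPY refine.orders
FROM 's3://example-refine-bucket/orders/batch_001/'
IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-copy-role'
FORMAT AS CSV
GZIP
COMPUPDATE OFF    -- skip compression analysis on small micro-batches
STATUPDATE OFF;   -- defer statistics to a single ANALYZE per batch window

ANALYZE refine.orders;  -- refresh planner statistics once per window
```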
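And a minimal SCD Type 2 sketch in portable SQL, with illustrative table and column names (dim_customer, stg_customer); in the project this logic was built as Informatica mappings rather than hand-written SQL.

```sql
-- Step 1: expire the current row of any customer whose tracked
-- attributes changed in the incoming staging data.
UPDATE dim_customer
SET    eff_end_dt  = CURRENT_DATE - 1,
       current_flg = 'N'
WHERE  current_flg = 'Y'
  AND EXISTS (SELECT 1
              FROM   stg_customer s
              WHERE  s.customer_id = dim_customer.customer_id
                AND (s.address <> dim_customer.address
                     OR s.segment <> dim_customer.segment));

-- Step 2: insert a fresh current version for changed and brand-new customers.
INSERT INTO dim_customer
       (customer_id, address, segment, eff_start_dt, eff_end_dt, current_flg)
SELECT s.customer_id, s.address, s.segment,
       CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM   stg_customer s
LEFT JOIN dim_customer d
       ON  d.customer_id = s.customer_id
       AND d.current_flg = 'Y'
WHERE  d.customer_id IS NULL;  -- no current row: new, or expired in step 1
```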

Environment: Informatica PowerCenter 10.2, Informatica PowerExchange 10.2, Informatica Data Quality (IDQ) 10.0, Autosys, CDC, Oracle 12c, DB2, Azure, SQL Server 2014, AWS, Hive, Hadoop, Impala, Teradata v15, Toad, Netezza, UNIX.

Confidential, Minnetonka, MN

ETL Informatica Developer

Responsibilities:

  • Gathered requirements and implemented them as source-to-target mappings.
  • Experience in integrating relational sources like SQL Server and MS Access and non-relational sources like flat files into the staging area.
  • Designed custom reports via SQL Server Reporting Services to align with requests from internal account teams and external clients.
  • Designed and developed technical and business data quality rules in IDQ (Informatica Developer) and created scorecards to present to business users for trending analysis (Informatica Analyst).
  • Effectively worked on Mapping Designer, Workflow Manager, and Workflow Monitor.
  • Used shortcuts for sources, targets, transformations, mapplets, and sessions to reuse objects without creating multiple objects in the repository and to inherit changes made to the source automatically.
  • Integrated Informatica Data Quality (IDQ) with Informatica PowerCenter; created POC data quality mappings in the Informatica Data Quality tool and imported them into Informatica PowerCenter as mappings and mapplets.
  • Applied slowly changing Dimensions Type I and Type II on business requirements.
  • Extensively worked on performance tuning and on isolating the header and footer in a single file.
  • Wrote SQL queries to create end-user reports and developed SQL queries and stored procedures in support of ongoing work and application support (a T-SQL sketch follows this list).
  • Used Cognos Transformer to build multidimensional cubes.
  • Project planning and scoping, facilitating meetings for project phases, deliverables, escalations and approval. Ensure adherence to SDLC and project plan.
  • Precisely documented mappings to ETL Technical Specification document for all stages for future reference.
  • Scheduled jobs through Control-M to run daily, weekly, and monthly loads, with each workflow in a sequence with command and event tasks.
  • Used most of the transformations, such as Aggregator, Filter, Router, Sequence Generator, Update Strategy, Rank, Expression, and Lookup (connected and unconnected), along with mapping parameters, session parameters, mapping variables, and session variables.
  • Responsible for creating workflows and worklets. Created Session, Event, Command, Control, Decision, and Email tasks in Workflow Manager.
  • Maintained proper communication between other teams and the client.
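A hedged example of the kind of reporting stored procedure written here, in T-SQL for the SQL Server 2008 environment; the procedure, table, and column names are invented for illustration.

```sql
-- Monthly claim counts and paid amounts for an end-user report.
CREATE PROCEDURE dbo.usp_ClaimsByMonth
    @StartDate DATE,
    @EndDate   DATE
AS
BEGIN
    SET NOCOUNT ON;  -- suppress row-count messages in the report feed

    SELECT  DATEPART(YEAR,  c.claim_dt)  AS claim_year,
            DATEPART(MONTH, c.claim_dt)  AS claim_month,
            COUNT(*)                     AS claim_cnt,
            SUM(c.paid_amt)              AS total_paid
    FROM    dbo.claims AS c
    WHERE   c.claim_dt >= @StartDate
      AND   c.claim_dt <  DATEADD(DAY, 1, @EndDate)  -- inclusive end date
    GROUP BY DATEPART(YEAR, c.claim_dt), DATEPART(MONTH, c.claim_dt)
    ORDER BY claim_year, claim_month;
END;
```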

Environment: Informatica PowerCenter 9.5, PowerExchange 9.5, Erwin r7, Oracle 10g, SQL, PL/SQL, DB2 8.0, MS SQL Server 2008, Flat Files, Autosys, Windows XP, UNIX, SQL*Loader, TOAD, ANSI SQL

Confidential, San Francisco, CA

ETL Informatica Developer

Responsibilities:

  • Perform requirement analysis through coordination with business analysts and define business and functional specifications.
  • As a team, conducted gap analysis and discussions with subject matter experts to gather requirements, emphasize problem areas, and define deliverables.
  • Performed logical and physical data modeling using Erwin for the data warehouse database in a star schema.
  • Automated/Scheduled the Informatica IICS (cloud) jobs to run daily with email notifications for any failures.
  • Convert specifications to programs and data mapping in an ETL Informatica IICS(Cloud) environment.
  • Using Informatica PowerCenter Designer, analyzed the source data to extract and transform it from various source systems (Oracle 10g, DB2, SQL Server, and flat files), incorporating business rules using the different objects and functions that the tool supports.
  • Using Informatica PowerCenter, created mappings and mapplets to transform the data according to the business rules.
  • Designed the ETL mappings from sources to operational staging targets, and then to the data warehouse, using PowerCenter Designer.
  • Involved in the design of the star schema for the revenue management system data warehouse.
  • Extensively used Informatica Data Quality (IDQ) profiling capabilities to profile various sources, generate scorecards, create and validate rules, and provide data to business analysts for creating rules.
  • Involved in Design, analysis, Implementation, Testing and support of ETL processes for Stage, ODS and Mart.
  • Documented Informatica mappings in Excel spreadsheets.
  • Extensively used Autosys for Scheduling and monitoring.
  • Responsible for developing, supporting, and maintaining the ETL (Extract, Transform and Load) processes using Informatica PowerCenter 9.1, with various transformations like Expression, Source Qualifier, Filter, Router, Sorter, Aggregator, Update Strategy, and connected and unconnected Lookup.
  • Extensively used various performance-tuning techniques to improve session performance (see the SQL override sketch after this list).
  • Developed and maintained ETL (Extract, Transformation and Loading) mappings to extract the data from multiple source systems like Oracle, SQL server and Flat files and loaded into Oracle.
  • Monitored workflows and sessions using the PowerCenter Workflow Monitor.
  • Used the Informatica Scheduler for scheduling the workflows.
  • Provide documentation of ETL design and technical specifications, high level and low-level design specifications.
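One common tuning technique, sketched here with hypothetical table names (orders, load_control), is overriding the Source Qualifier SQL so filtering, joining, and sorting happen in the database before rows enter the mapping.

```sql
-- Source Qualifier SQL override: extract only the incremental window,
-- drop rows the mapping would filter anyway, and pre-sort the stream
-- for a downstream sorted Joiner/Aggregator.
SELECT  o.order_id,
        o.customer_id,
        o.order_dt,
        o.amount
FROM    orders o
JOIN    load_control lc
        ON  o.order_dt >  lc.last_extract_dt
        AND o.order_dt <= lc.run_extract_dt
WHERE   o.status = 'POSTED'
ORDER BY o.customer_id;
```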

Environment: Informatica PowerCenter 9.6.1, IDQ 9.6, Oracle 11g, IICS R29, SQL, PL/SQL, DB2 8.0, MS SQL Server 2008, T-SQL, Flat Files, Teradata 14, Hive, Windows XP, UNIX, SAP BODS

Confidential, New York

ETL Informatica Developer/ Data Analyst

Responsibilities:

  • Involved in creation of Logical Data Model for ETL mapping and the process flow diagrams.
  • Worked with SQL Developer to write SQL code for data manipulation.
  • Worked with the Informatica versioned repository, using the check-in and check-out feature for objects.
  • Used Debugger extensively to validate the mappings and gain troubleshooting information about data and error conditions.
  • Provided guidance to less experienced personnel. Conducted quality assurance activities such as peer reviews.
  • Worked with production support systems that required immediate support.
  • Develop, execute and maintain appropriate ETL development best practices and procedures.
  • Assisted in the development of test plans for assigned projects.
  • Monitor and tune ETL processes for performance improvements; identify, research, and resolve data warehouse load issues.
  • Involved in unit testing of the mapping and SQL code.
  • Developed mappings to load data in slowly changing dimensions.
  • Involved in performance tuning of source & target, mappings, sessions and workflows.
  • Worked with various Teradata utilities like BTEQ and FastLoad, and created procedures.
  • Worked with connected, unconnected lookups and reusable transformations and mapplets.
  • Utilized UNIX shell scripts for adding headers to the flat file targets.
  • Involved in designing the star schema and populating the fact table and associated dimension tables (a fact-load sketch follows this list).
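A hedged sketch of a star-schema fact load of the kind described above, with illustrative names (stg_sales, dim_date, fact_sales); in practice this was implemented as Informatica mappings, with Lookup transformations resolving the surrogate keys.

```sql
-- Resolve surrogate keys by joining staged transactions to their
-- dimensions, then load the fact table.
INSERT INTO fact_sales (date_key, product_key, store_key, qty, sales_amt)
SELECT  dd.date_key,
        dp.product_key,
        ds.store_key,
        s.qty,
        s.sales_amt
FROM    stg_sales s
JOIN    dim_date    dd ON dd.calendar_dt  = s.sale_dt
JOIN    dim_product dp ON dp.product_code = s.product_code
JOIN    dim_store   ds ON ds.store_code   = s.store_code;
```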

Environment: Informatica 8.6, Oracle 10g, SQL Server, OBIEE, Teradata, Sybase, Windows XP, Visio 2000, Business Objects XI R2, ESP, SCM, PuTTY, WinSCP, UNIX, Windows.
