
ETL/Informatica Developer Resume


Eagan, MN

SUMMARY

  • Diverse experience in the Information Technology industry with an emphasis on Data Analysis and SQL Development across multiple domains, including Finance, Insurance, Sales & Marketing and Healthcare. An excellent facilitator between business clients and IT personnel, making effective use of industry best practices. Quick learner with excellent analytical and technical skills, and an excellent team player.
  • 8+ years of experience in Requirements gathering, Gap analysis, Designing, Implementation, Coding, Testing, Production support and Resource Management in Data warehousing with business knowledge of Insurance and Telecom.
  • Good working experience in Software Development Life Cycle (SDLC) methodologies like Waterfall and Agile.
  • Experienced in using ETL tools including Informatica PowerCenter 10.x/9.x/8.x, PowerMart, Power Exchange, Workflow Manager, Repository Manager, Administration Console and Informatica Data Quality (IDQ).
  • Performed data profiling and analysis using Informatica Data Quality (IDQ).
  • Implemented Data Warehousing Methodologies using ETL/Informatica Designer, Repository Manager, Workflow Manager, Workflow Monitor, and Talend Open Studio Data Integration.
  • Used various Informatica PowerCenter and Data Quality transformations such as Aggregator, Source Qualifier, Update Strategy, Expression, Joiner, Lookup, Router, Sorter, Filter, XML Parser, Labeler, Parser, Address Validator, Match, Merge, Comparison and Standardizer to perform various data loading and cleansing activities.
  • Worked on various applications using Python in integrated IDEs such as Eclipse.
  • Extensively used the Informatica Data Masking transformation to mask NPI data (SSN, birth date, account number, etc.) in Dev and QA environments.
  • Strong experience with Informatica tools using real-time Change Data Capture (CDC) and MD5.
  • Experienced in using advanced concepts of Informatica like Push Down Optimization (PDO).
  • Designed and developed Informatica mappings including Type I, Type II and Type III slowly changing dimensions (SCD).
  • Created mappings using different IDQ transformations like Parser, Standardizer, Match, Labeler and Address Validator.
  • Used the Address Validator transformation to validate source data against reference data and standardize addresses.
  • Experienced in Teradata SQL Programming.
  • Worked with Teradata utilities like FastLoad, MultiLoad, TPump and Teradata Parallel Transporter.
  • Experience in using transformations, creating Informatica Mappings, Mapplets, Sessions, Worklets and Workflows, and processing tasks using Informatica Designer / Workflow Manager.
  • Experienced in scheduling Informatica jobs using scheduling tools like Tidal, Autosys and Control-M.
  • Extensive experience in Netezza database design and workload management.
  • Experienced in writing SQL and PL/SQL programs, Stored Procedures, Functions, Triggers, Views and Materialized Views.
  • Experienced in scheduling Talend and Informatica jobs using Autosys and Automic applications.
  • Good command of databases such as Oracle 11g/10g/9i/8i, Teradata 13, SQL Server 2008, Netezza and MS Access 2003.
  • Data Modeling: Data modeling knowledge in Dimensional Data Modeling, Star Schema, Snow-Flake Schema, and FACT and Dimension tables.
  • Extensive experience in writing UNIX shell scripts and automation of the ETL processes using UNIX shell scripting.
  • Experience in resolving on-going maintenance issues and bug fixes, monitoring Informatica sessions as well as performance tuning of mappings and sessions.
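The MD5-based change detection mentioned above (used alongside CDC) can be sketched roughly as follows; the function names, key column and tracked columns are illustrative assumptions, not taken from any specific project:

```python
import hashlib

def row_hash(row, columns):
    """Concatenate the tracked column values and hash them, mirroring an
    MD5(col1 || col2 || ...) expression used for change detection."""
    joined = "|".join(str(row[c]) for c in columns)
    return hashlib.md5(joined.encode("utf-8")).hexdigest()

def classify(row, existing_hashes, key, columns):
    """Decide INSERT / UPDATE / NOCHANGE by comparing the incoming row's
    hash against the hash previously stored for the same business key."""
    h = row_hash(row, columns)
    old = existing_hashes.get(row[key])
    if old is None:
        return "INSERT"
    return "NOCHANGE" if old == h else "UPDATE"
```

In an ETL flow this classification would typically feed an Update Strategy step, so unchanged rows are skipped instead of being rewritten on every load.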

TECHNICAL SKILLS

Data Warehousing / ETL: Informatica PowerCenter 10.x/9.x/8.x, Informatica Data Quality 10.x/9.x, SSIS, Power Exchange, Metadata Manager 9.6.1, Talend Data Integration, Data Mart, OLAP, OLTP and ERwin 4.x/3.x.

Data Modeling: Dimensional Data Modeling, Data Modeling, Star Schema Modeling, Snow-Flake Modeling, FACT and Dimension Tables, Facets, Physical and Logical Data Modeling, Erwin and Oracle Designer.

Databases & Tools: Oracle 11g/10g/9i/8i, Teradata, DB2 UDB 8.5, SQL Server 2012/2008/2005, Netezza, Golden Gate, SQL*Plus, SQL*Loader and TOAD.

Scheduling Tools: Autosys, Tidal and Control-M.

Reporting Tools: OBIEE, Tableau, Business Objects XI/6.5/6.0 and Cognos Series 7.0.

Programming Languages: Unix Shell Scripting, SQL, PL/SQL, Java, XML, XSD, Python Scripting, C#.

Methodology: Agile, Scrum and Waterfall.

Environment: Windows, UNIX and LINUX.

PROFESSIONAL EXPERIENCE

Confidential, Eagan, MN

ETL/Informatica Developer

Responsibilities:

  • Work with the business to analyze data quality, data mismatch and reconciliation issues, and fix Informatica ETL jobs to address data mismatches.
  • Involved in updating the Technical Design Specifications whenever changes or updates are needed.
  • Analyze large volumes of data, perform data cleansing and transformation, and migrate data to the respective data marts.
  • Design and creation of Informatica Mappings or Mapping task (IICS), Informatica Power Center Mapping, Informatica Power Exchange CDC Mappings, Informatica MDM/RDM mappings, tasks through Informatica Developer, Teradata Scripts, SQL Queries and UNIX shell scripts.
  • Built cloud solutions using the IICS to integrate other cloud services into one platform of cloud data warehouse.
  • Performed POC’s on converting existing Informatica Jobs to Informatica Cloud Jobs using IICS.
  • Worked on Informatica Intelligent Cloud Services (IICS) Application Integration components like Processes, Service Connectors, and Process Objects to integrate with other applications and with CIH, CDI and CAI.
  • Design and Development of Informatica ETL Jobs, Mappings, Workflows and ETL mapping document.
  • Create advanced PL/SQL programs and SQL scripts to develop new business requirements and application enhancements.
  • Created Talend jobs to copy files from one server to another and utilized Talend FTP components.
  • Created and Configured Workflows, Worklets, and Sessions to transport the data to target warehouse Netezza tables using Informatica Workflow Manager.
  • Worked with NZLoad to load flat file data into Netezza tables.
  • Extensively involved in writing triggers, cursors, procedures, functions, loops in Oracle 12c for various purposes and optimizing the SQL to improve performance.
  • Experience in working on cloud product integration tools and cloud databases like Informatica Intelligent Cloud Services, AWS Redshift and Salesforce.
  • Developed Unix scripts for source file validations to eliminate hidden special characters and for file acquisitions; also used FTP to download source files from remote servers.
  • Used ETL methodologies and best practices to create Talend ETL jobs.
  • Writing UNIX shell scripts to automate Informatica ETL Workflows, Flat file processing, and to schedule batch jobs.
  • Designed Database deployment automation through Continuous Integration.
  • Creating, Scheduling and deploying Jobs against each workflow in CA WorkStation (ESP).
  • Using various ETL tool debuggers to troubleshoot performance issues and resolve bottlenecks in the mappings.
  • Used the Talend Administration Console (TAC) for scheduling jobs and adding users.
  • Writing Linux automation test scripts and running them against the databases to retrieve and compare data between the source and target databases.
  • Extensively used Git as the external repository and created UrbanCode Deploy (UCD) scripts for automated deployment of SQL and Unix scripts to servers.
  • Provide 24/7 application support for any job failures in production, immediate code fixes within SLA and resume the flows.
  • Working on P1/P2 incidents for data-related issues: identifying bottlenecks and missing data in tables, providing an immediate fix and reloading missing data into the target table.
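The source-file validation described above (eliminating hidden special characters before loading) could look roughly like this in Python; the function names and the exact set of characters kept are assumptions for illustration:

```python
import string

# Keep standard printable characters plus tab/newline/carriage return,
# but drop vertical tab and form feed, which often break delimited parsers.
PRINTABLE = set(string.printable) - set("\x0b\x0c")

def clean_line(line):
    """Strip hidden/non-printable characters from one line of a source file."""
    return "".join(ch for ch in line if ch in PRINTABLE)

def clean_file(src_path, dst_path):
    """Write a cleansed copy of a flat file, line by line."""
    with open(src_path, "r", encoding="utf-8", errors="replace") as src, \
         open(dst_path, "w", encoding="utf-8") as dst:
        for line in src:
            dst.write(clean_line(line))
```

In practice the same cleansing is often done with `tr -cd` or `sed` in a shell wrapper before the ETL workflow picks the file up.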

Environment: Informatica Power Center, IICS, Snowflake, Netezza, Oracle PL/SQL, Talend Data Integration, Talend Administrator Console, Unix platforms, Oracle Forms, CA Workstation (ESP), GitHub, Python, DBeaver, Toad, Shell Scripting.

Confidential, Bloomington MN

ETL/Informatica Developer

Responsibilities:

  • Worked with the Informatica Data Quality 10.0 (IDQ) toolkit: analysis, data cleansing, data matching, data conversion, exception handling, and the reporting and monitoring capabilities of IDQ 10.0.
  • Extensively worked on Informatica tools such as Source Analyzer, Workflow Manager, Transformation Designer, Mapplet Designer and Mapping Designer.
  • Using Informatica Power center 10.1 to make the changes to the existing ETL mappings in each of the environments.
  • Wrote PL/SQL procedures, called from the Stored Procedure transformation, to perform database actions such as truncating the target before load, deleting records based on a condition and renaming tables.
  • Create ETL mappings with complex business logic for high-volume data loads using various transformations such as Aggregator, Sorter, Filter, Normalizer, SQL Transformation, Lookup, Joiner, Router, Update Strategy, Union and Sequence Generator, as well as transformation-language features like expressions, constants, system variables and data format strings.
  • Used ETL methodologies and best practices to create Talend ETL jobs. Followed and enhanced programming and naming standards.
  • Developed Mapplets, Reusable Transformations, Source and Target definitions, mappings using Informatica 10.0.
  • Extensively worked on building workflows, worklets, sessions, command tasks etc.
  • Extensively worked on ETL performance tuning to tune the data load; worked with DBAs on SQL query tuning.
  • Proficient in implementing all transformations available on the SSIS toolbox.
  • Using Source Qualifier & Lookup SQL overrides, Persistent caches, Incremental Aggregation, Partitions, and Parameter Files etc. for better performance.
  • Created sessions to use pre- and post-session stored procedures to truncate tables and drop and create indexes on tables for bulk loading.
  • Creating mapping from Source to Target in Talend.
  • Working on change requests to the existing processes in production
  • Used Event wait Task, Control Task, Email Tasks, Command Tasks, Link conditions in workflows.
  • Bulk load the database tables from flat files using Oracle SQL Loader
  • Working on creating and modifying Unix shell scripts and SSIS packages; designing and building data marts and cubes with SSAS and MDX for fast target data retrieval.
  • Provide test data in QA/UAT for the downstream application testing
  • Extensively worked on the Repository Manager to create, modify and delete users, groups and roles.
  • Involved in creating labels and migrating ETL code between different environments.
  • Extensively used PL/SQL, TOAD programming in backend and front-end functions, procedures, packages to implement business rules
  • Integrated various sources into the Staging area in Data warehouse
  • Provided production support to schedule and execute production batch jobs and analyzed log files on Informatica 9.5 and 9.6.1 Integration servers.
  • Developed shell scripts to automate the data loading process and to cleanse the flat file inputs
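The incremental aggregation mentioned above avoids recomputing the full aggregation on every run: only newly arrived rows are merged into previously stored totals. A minimal Python sketch of the idea, with hypothetical field names:

```python
def incremental_aggregate(agg, new_rows, key, value):
    """Merge only new rows into existing aggregate totals.

    agg       -- dict of previously computed totals, keyed by group
    new_rows  -- rows arrived since the last run
    key/value -- names of the grouping column and the measure column
    """
    for row in new_rows:
        k = row[key]
        agg[k] = agg.get(k, 0) + row[value]
    return agg
```

In PowerCenter the same effect comes from enabling the Incremental Aggregation session property, which persists the aggregate cache between runs; this sketch only illustrates the merge step.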

Environment: Informatica Power Center 10.1, IDQ 10.0, Informatica MDM 10.0, Talend Data Integration, Talend Administrator Console, Oracle Database 11g, SQL server, Toad for Oracle, Unix Shell scripts, Teradata.

Confidential, Eden Prairie-MN

Informatica/IDQ Developer

Responsibilities:

  • Involved in all phases of SDLC from requirement gathering, design, development, testing, Production, user training and support for production environment.
  • Designed table structures in Netezza.
  • Designed various mappings and Mapplets using different transformation techniques such as Key Generator, Match, Labeler, Case Converter, Standardizer and Address Validator.
  • Developed new mapping designs using various tools in Informatica like Source Analyzer, Warehouse Designer, Mapplet Designer and Mapping Designer.
  • Developed the mappings using transformations in Informatica according to the technical specifications.
  • Created complex mappings that involved implementation of business logic to load data into the staging area.
  • Wrote Python scripts to parse XML documents and load the data in database.
  • Implemented Data Quality rules using Informatica Data Quality (IDQ) to check the correctness of the source files and perform data cleansing/enrichment.
  • Involved in massive data profiling using IDQ (Analyst tool) prior to data staging.
  • Created profiles and scorecards for the users using IDQ.
  • Built several reusable components in IDQ using Parsers, Standardizers and reference tables.
  • Developed merge jobs in Python to extract and load data into a MySQL database, and used a test-driven approach for developing applications.
  • Used Informatica reusability at various levels of development.
  • Involved in database migrations from legacy systems and SQL Server to Oracle and Netezza.
  • Developed mappings/sessions using Informatica Power Center 9.6.1 for data loading.
  • Performed data manipulations using various Informatica transformations like Filter, Expression, Lookup (connected and unconnected), Aggregator, Update Strategy, Normalizer, Joiner, Router, Sorter and Union.
  • Created mappings and Mapplets according to business requirements using the Informatica big data version, deployed them as applications and exported them to PowerCenter for scheduling.
  • Developed workflows using Task Developer, Worklet Designer and Workflow Designer in Workflow Manager and monitored the results using Workflow Monitor.
  • Building Reports according to user Requirement.
  • Experienced in loading data between Netezza tables using the NZSQL utility.
  • Extracted data from Oracle and SQL Server, then used Teradata for data warehousing.
  • Implemented slowly changing dimension methodology for accessing the full history of accounts.
  • Wrote UNIX shell scripts to load data from flat files into the Netezza database.
  • Scheduled Informatica jobs and implemented dependencies where necessary using Autosys.
  • Performed performance tuning at the source, target, mapping and session levels.
  • Participated in weekly status meetings, conducted internal and external reviews and formal walkthroughs among various teams, and documented the proceedings.
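The Python XML-parsing-and-load work described above might be sketched as follows; sqlite3 stands in for the actual target database, and the `<record>` element layout, table name and columns are assumptions for illustration:

```python
import sqlite3
import xml.etree.ElementTree as ET

def load_xml(xml_text, conn):
    """Parse <record> elements from an XML document and insert them
    into a records table. Returns the number of rows loaded."""
    root = ET.fromstring(xml_text)
    conn.execute("CREATE TABLE IF NOT EXISTS records (id INTEGER, name TEXT)")
    rows = [(int(r.findtext("id")), r.findtext("name"))
            for r in root.iter("record")]
    conn.executemany("INSERT INTO records (id, name) VALUES (?, ?)", rows)
    conn.commit()
    return len(rows)
```

For a real MySQL target, the same pattern applies with a MySQL driver connection in place of sqlite3; parameterized `executemany` keeps the load set-based rather than row-by-row.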

Environment: Informatica Power Center 9.6.1, IDQ 9.6.1, Oracle 11g/10g, Teradata V2R5, Vertica, TOAD for Oracle, SQL Server 2008, PL/SQL, DB2, Netezza, Golden Gate, Agile, Ruby, Python, SQL, Erwin 4.5, Business Objects, Unix Shell Scripting (Perl), UNIX (AIX), Windows XP and Autosys.

Confidential

Data Analyst

Responsibilities:

  • Performed requirements gathering, developed the Analysis & Design (A&D) document and developed the project timeline.
  • Designed and developed the ETL mappings for source system data extraction, transformation, staging, movement and aggregation.
  • Developed standard mappings using various transformations like Expression, Aggregator, Joiner, Source Qualifier, Router, Lookup and Filter for weekly and quarterly processes to load heterogeneous data into the data warehouse. Source files included delimited flat files and SQL Server tables.
  • Extensively used transformations like Router, Aggregator, Normalizer, Joiner, Expression, Lookup, Update Strategy, Union, SQL, Java and Sequence Generator.
  • Executed sessions, both sequential and concurrent, for efficient execution of mappings and used other tasks like Event Wait, Event Raise, Email and Command.
  • Used Informatica for loading historical data from various tables for different departments.
  • Developed Informatica mappings for Slowly Changing Dimensions Type 1 and Type 2.
  • Created Mapplets for implementing the incremental logic, identifying the fiscal quarter of the transaction and calculating various business process requirements.
  • Involved in unit and integration testing of the mappings and workflows developed for enhancements to the application.
  • Migrated Informatica job code from Development to Test to Production, performing unit testing, integration testing, and job and environment parameter testing along the way.
  • Scheduled and ran extraction and load processes and monitored tasks and workflows.
  • Tuned the MMW databases (stage, target) for better, faster and more efficient loading and user query performance.
  • Created Informatica mappings with PL/SQL stored procedures/functions to incorporate critical business functionality into data loads.
  • Extensively worked on performance tuning of mappings, sessions, targets, sources and various ETL processes.
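The fiscal-quarter identification mentioned in the Mapplet bullet can be sketched in Python. The October fiscal-year start is an assumption for illustration, not stated in the source:

```python
import datetime

FISCAL_YEAR_START_MONTH = 10  # assumption: fiscal year begins in October

def fiscal_quarter(d):
    """Map a transaction date to its fiscal quarter (1-4).
    With an October start: Oct-Dec -> Q1, Jan-Mar -> Q2, and so on."""
    shifted = (d.month - FISCAL_YEAR_START_MONTH) % 12
    return shifted // 3 + 1
```

In a PowerCenter Mapplet the equivalent logic would typically live in an Expression transformation using date functions against the transaction-date port.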

Environment: Informatica PowerCenter, Informatica Cloud, Netezza, Oracle 10g, MS SQL Server, PL/SQL Developer, Bourne shell, TOAD, MS Office, delimited flat files, Control-M.
