
Etl (informatica) Developer Resume


Chicago, IL

SUMMARY

  • Over 6 years of experience in the development, testing, and implementation of Business Intelligence solutions using Data Warehouse/Data Mart design, ETL, Tableau, OLAP, BI, and Client/Server applications with Teradata.
  • Extensively worked in ETL and data integration, developing ETL mappings and scripts.
  • Hands on Data Warehousing clustered environment ETL experience of using Informatica 9.6.1/9.5/8.6.1 Power Center Client tools - Mapping Designer, Repository manager, Workflow Manager/Monitor and Server tools - Informatica Server, Repository Server manager.
  • Expertise in Data Warehouse/Data Mart, file feeds, ODS, OLTP, and OLAP implementations, covering project scope, data modeling, ETL development, system testing, implementation, and production support.
  • Used transformations like Joiner, Expression, Connected and Unconnected lookups, Filter, Aggregator, Store Procedure, Rank, Update Strategy, Router and Sequence generator.
  • Designed ETL process using Informatica Tool to load from Sources to Targets through data Transformations.
  • Knowledge of data cleansing in Data Quality using the Match, Address Cleanse, and Data Cleanse transforms in BODS.
  • Experience in Dimensional Modeling using Star and Snow Flake Schema, Identifying Facts and Dimensions, Physical and logical data modeling using ERwin and ER-Studio.
  • Expertise in working with relational databases such as Oracle 11g/10g, SQL Server 2008/2005 and Teradata.
  • Excellent skills on Oracle, Netezza, Teradata, SQL Server, and DB2 database architecture.
  • Worked on lambda architecture for Real-Time Streaming and Batch processing of weblogs.
  • Hands on experience in developing Stored Procedures, Functions, Views and Triggers, Complex SQL queries using SQL Server, Oracle SQL and Oracle PL/SQL.
  • Expert in building Data Integration, Data Visualization, Workflow solutions, and ETL solutions for clustered data warehouse using SQL Server Integration Services (SSIS).
  • Experience as Business Intelligence Developer using Microsoft BI framework (SQL server, SAS and R, SSIS, SSAS, SSRS) in various business domains including Finance, Insurance, and Information Technology.
  • Experience in writing expressions in SSRS and Expert in fine tuning the reports. Created many Drill through and Drill Down reports using SSRS.
  • Experience using Visio and ERwin design tools to create Star and Snowflake schemas and document deployment processes.
  • Built Splunk dashboards using XML and Advanced XML, and created scheduled alerts for application teams for real-time monitoring.
  • Experience in Big Data analysis, frequent itemset mining, and association rule mining.
  • Experience developing procedures and functions in PL/SQL, and troubleshooting and performance tuning PL/SQL scripts.
  • Experience in Python and UNIX shell scripting for processing large volumes of data from varied sources and loading into Vertica.
  • Experience in creating profiles using Informatica Data Quality Developer and analyst tool.
  • Partitioned large Tables using range partition technique.
  • Experience with Oracle-supplied packages such as DBMS_SQL, DBMS_JOB, and UTL_FILE.
  • Experience in resolving on-going maintenance issues and bug fixes; monitoring Informatica sessions as well as performance tuning of mappings and sessions.
  • Knowledge of Informatica Data Quality, Informatica Analyst, Informatica MDM, Informatica Developer Big Data Edition, Informatica B2B Data Transformation, etc.
  • Database (ETL) testing, report testing, functionality, E2E, and regression testing.
  • Experience in all phases of data warehouse development, from requirements gathering through code development, unit testing, and documentation.
  • Proficient in the Integration of various data sources with multiple relational databases like Oracle11g / Oracle10g, MS SQL Server, DB2, Teradata, VSAM files and Flat Files into the staging area, ODS, Data Warehouse and Data Mart.
  • Worked with a Netezza pharmaceutical database to implement data cleanup and performance-tuning techniques.
  • Experience in using Automation Scheduling tools like Autosys and Control-M.
  • Worked extensively with slowly changing dimensions in EDW environment.
  • Experience in Informatica B2B Data Exchange using Unstructured, Structured Data sets.
  • Hands-on experience across all stages of Software Development Life Cycle (SDLC) including business requirement analysis, data mapping, build, unit testing, system integration and user acceptance testing.
  • Experience in UNIX shell scripting, FTP, Change Management process and EFT file management in various UNIX environments.
  • Highly motivated to take independent responsibility, with the ability to contribute as a productive team member; excellent verbal and communication skills and a clear understanding of business procedures.
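The slowly-changing-dimension handling mentioned above can be sketched in plain Python. This is an illustrative sketch only, not the actual Informatica implementation: field names such as cust_id and city are assumptions, and in PowerCenter this logic would live in an Update Strategy mapping.

```python
from datetime import date

def apply_scd2(dim_rows, incoming, key="cust_id", tracked=("city",)):
    """Minimal SCD Type 2 sketch: when a tracked attribute changes,
    expire the current dimension row and insert a new current version."""
    current = {r[key]: r for r in dim_rows if r["is_current"]}
    out = list(dim_rows)
    for rec in incoming:
        old = current.get(rec[key])
        if old is None:
            # Brand-new key: insert as the current version.
            out.append({**rec, "eff_date": date.today(),
                        "end_date": None, "is_current": True})
        elif any(old[c] != rec[c] for c in tracked):
            # Tracked attribute changed: close out the old row, add a new one.
            old["is_current"] = False
            old["end_date"] = date.today()
            out.append({**rec, "eff_date": date.today(),
                        "end_date": None, "is_current": True})
    return out
```

A Type 1 variant would simply overwrite the tracked attributes in place instead of appending a new version.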

TECHNICAL SKILLS

Databases: Oracle 7.x/8i/9i/10g/11g (SQL, PL/SQL, Stored Procedures, Triggers), MS SQL Server 2000/2005/2008, DB2/UDB, Teradata, SAP Tables and MS Access.

ETL Tools: Informatica PowerCenter 9.6.1/9.5/8.6.1 (Designer, Workflow Manager, Workflow Monitor, Repository Manager, Informatica Server), SSIS (Integration Services), Power Exchange CDC, Ab Initio 1.8.

Data Modeling tools: Erwin 4.0, MS Visio.

Languages/Utilities: SQL, JDBC, PL/SQL, Python, UNIX, Shell scripts, SOAP UI, Perl, Web Services, JavaScript, HTML, XML/XSD, Eclipse, C

IDE/Tools: Putty, Toad, SQL Developer, SQL Loader, HP Quality center

Operating Systems: UNIX (Sun Solaris, LINUX, HP UNIX, AIX), Windows NT, Windows XP, Windows 7, 8, 10.

Scheduling Tools: Tidal, AutoSys 11, UC4.

Testing Tools: QTP, WinRunner, LoadRunner, Unit test, System test, Quality Center, Test Director, Clear test, ClearCase.

PROFESSIONAL EXPERIENCE

Confidential, Chicago, IL

ETL (Informatica) Developer

Responsibilities:

  • Interacted with business owners to gather both functional and technical requirements.
  • Understood and reviewed the functional requirements received from different states with the Business Analyst, and signed off on the requirements document.
  • Prepared technical design document as per the functional specification and unit test cases.
  • Developed and tested Informatica mappings based on the specification.
  • Used various transformations to extract data from different formatted files and relational source system.
  • Design and develop PL/SQL packages, stored procedure, tables, views, indexes, and functions; implement best practices to maintain optimal performance.
  • Design, develop, and test Informatica mappings, workflows, worklets, reusable objects, SQL queries, and Shell scripts to implement complex business rules.
  • Developed reusable transformations and mapplets to eliminate redundant coding, reducing development time and improving loading performance at both the mapping and session levels.
  • Maintained source definitions, transformation rules, and target definitions using Informatica Repository Manager.
  • Broad experience in use of Informatica Data Quality (IDQ) for Initial Data profiling and matching, removing duplicate data.
  • Used Informatica B2B Data Exchange to process structured data such as XML.
  • Performed/automated many ETL related tasks including data cleansing, conversion, and transformations to load Oracle 10G based Data Warehouse.
  • Analyzed the data based on requirements, wrote techno-functional documents, and developed complex mappings using Informatica Data Quality (IDQ).
  • Created tables in staging to reduce code change in mapping to handle dynamic field positions in the source data files and generating flat file.
  • Designed and developed strategy for the Workflows and set dependencies between the workflows.
  • Extensive work in SSRS, SSAS, SSIS, MS SQL Server, SQL Programming and MS Access.
  • Worked extensively on Erwin and ER Studio in several projects in both OLAP and OLTP applications.
  • Performed user acceptance, E2E, multi-browser, regression, and smoke testing.
  • Validated and debugged old mappings, tested workflows and sessions, and identified better technical solutions. Identified bottlenecks in old/new mappings and tuned them for better performance.
  • Designed Mappings using B2B Data Transformation Studio.
  • Debugged parameter issues and built matrix reports and charts.
  • Expert in creating parameterized, drill-down, drill-through, sub-report, linked, snapshot, cached, and ad-hoc reports using SSRS.
  • Understood ETL requirement specifications to develop HLD and LLD for SCD Type 1, Type 2, and Type 3 mappings, and was involved in testing for various data/reports.
  • Extensively worked with performance tuning at mapping levels like implementing active transformation like filter as early as possible in the mapping. Worked extensively with Update Strategy transformation for implementing inserts and updates.
  • Investigating software bugs and reporting to the developers using Quality Center Defect Module.
  • Worked on Informatica Data Quality Developer modules such as key generator, parser, standardizer, address validator, match, and consolidation.
  • Wrote PL/SQL stored procedures to manipulate the data.
  • Designed and developed the logic for handling slowly changing dimension table loads by flagging records using the Update Strategy transformation to populate the desired records.
  • Implemented slowly changing dimension (SCD) Type 1 and Type 2 mappings for changed data capture.
  • Helped with data profiling, specifying and validating rules (Scorecards), and monitoring data quality using the Informatica Analyst tool.
  • UAT testing for HIPAA 4010 and 5010 projects including legacy testing and HIPAA requirements and compliance mandates.
  • Extensively worked on Customer Improvement Plan (CIP) items by adding new planning areas to EDW Goal State.
  • Created test cases and assisted in UAT testing.
  • Reviewed Informatica mappings and system test cases before delivering to the client.
  • Developed Shell/Perl and MFT scripts to transfer files using FTP and SFTP and to automate ETL jobs.
  • Created UNIX shell scripts to archive and purge source files and weblogs.
  • Re-designed multiple existing Power Center mappings to implement change requests (CR) representing the updated business logic.
  • Migrated Informatica mappings, sessions, and workflows from the development environment to QA, and checked the developed code into TortoiseSVN for release and exception management.
  • Maintained all phases of support documents like operation manual, application flows.
  • Documented B2B data mappings/transformations as per the business requirement.
  • Transferred knowledge to outsource team prior to my project completion.

Environment: Informatica Power Center 9.1.0/9.6.1, Oracle 11g/10g RAC, ESP, Putty, Erwin, XML files, CSV files, SQL, PL/SQL, Linux, UNIX shell scripting, Netezza, Ab Initio Data Profiler, Windows 7, SSIS/SSRS, Informatica Cloud, Toad 3.0, Aginity, Cognos, BO BI 4.0.

Confidential

ETL Informatica Developer

Responsibilities:

  • Interacted with Data Modelers and Business Analysts to understand the requirements and the impact of the ETL on the business.
  • Designed ETL specification documents for all the projects.
  • Created Tables, Keys (Unique and Primary) and Indexes in the SQL server.
  • Extracted data from flat files, DB2, SQL Server, and Oracle to build an Operational Data Store. Applied business logic to load the data into the Global Data Warehouse.
  • Extensively worked on Facts and Slowly Changing Dimension (SCD) tables.
  • Maintained source and target mappings, transformation logic and processes to reflect the changing business environment over time.
  • Used various transformations like Filter, Router, Expression, Lookup (connected and unconnected), Aggregator, Sequence Generator, Update Strategy, Joiner, Normalizer, Sorter and Union to develop robust mappings in the Informatica Designer.
  • Extensively used the Add Currently Processed Flat File Name port to load the flat file name and to load contract number coming from flat file name into Target.
  • Integrated Informatica Data Quality (IDQ) with Informatica PowerCenter; created POC data quality mappings in the IDQ tool and imported them into PowerCenter as mappings and mapplets.
  • Worked on complex Source Qualifier queries, Pre and Post SQL queries in the Target.
  • Worked on different tasks in Workflow Manager like Sessions, Events raise, Event wait, Decision, E-mail, Command, Worklets, Assignment, Timer and Scheduling of the workflow.
  • Created Splunk Dashboards for Business and system performance monitoring.
  • Experience profiling Vertica queries and using utilities such as DBD, admintools, and the Workload Analyzer.
  • Created nzload scripts to load the flat files into Netezza staging tables.
  • Configured related Tidal scheduler jobs and performed unit, load, and partner testing of Confidential.
  • Performed extensive data quality checks using the Ab Initio Data Profiling Tool.
  • Wrote programs in SAS and R to generate reports, creating RTF, HTML listings, tables and reports using SAS/ODS for Ad-Hoc report generation.
  • Used UNIX and shell scripting extensively to enhance the Perl scripts, and developed, scheduled, and supported Control-M batch jobs for data generation and reporting. The Perl and shell scripts invoke the stored procedures for data load, computation, and report generation.
  • Architected, designed, and developed analytical dashboards for cost analysis and ITIL effectiveness management using Tableau.
  • Extensively used E2E workflow variables, mapping parameters and mapping variables.
  • Created sessions, batches for incremental load into staging tables and scheduled them to run daily.
  • Used shortcuts to reuse objects without creating multiple objects in the repository and inherit changes made to the source automatically.
  • Experience in DW concepts and technologies using Vertica application.
  • Developed complex reusable formula reports and reports with advanced features such as conditional formatting, built-in/custom function usage, and multiple-grouping reports in Tableau.
  • Implemented Informatica recommendations, methodologies and best practices.
  • Implemented performance tuning logic on Targets, Sources, Mappings and Sessions to provide maximum efficiency and performance.
  • Involved in Unit, Integration, System, and Performance testing levels.
  • Wrote documentation describing program development, logic, coding, testing, changes, and corrections.
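The incremental-load pattern referenced above (sessions that pick up only new rows into staging) can be sketched as follows. This is a minimal illustration: the updated_at column is an assumption, and in PowerCenter this watermark is typically carried in a mapping variable (a name like $$LAST_EXTRACT_TS would be hypothetical).

```python
def incremental_extract(rows, last_watermark):
    """Sketch of watermark-driven incremental load: return only rows newer
    than the stored watermark, plus the new high-water mark to persist."""
    new_rows = [r for r in rows if r["updated_at"] > last_watermark]
    new_wm = max((r["updated_at"] for r in new_rows), default=last_watermark)
    return new_rows, new_wm
```

Each daily run would persist new_wm so the next session resumes where the last one left off.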

Environment: Informatica Power Center 8.6.1, Oracle 10g, ERL, Alteryx, Axon, SQL Server 2008, IBM iSeries (DB2), MS Access, UNIX, Windows XP, NoSQL, Subversion (SVN), Ab Initio Data Profiler, Toad, Cognos 8.4.1, SQL Developer.

Confidential

ETL/Informatica Developer

Responsibilities:

  • Interacted with business analysts; analyzed, inspected, and translated business requirements into technical specifications.
  • Participated in system analysis and data modeling, which included creating tables, views, triggers, functions, indexes, procedures, and cursors.
  • Involved in creating Fact and Dimension tables using a Star schema.
  • Extensively involved working on the transformations like Source Qualifier, Filter, Joiner, Aggregator, Expression and Lookup.
  • Used session logs, workflow logs, and the Debugger to debug sessions and analyze problems associated with mappings and generic scripts.
  • Designed and developed complex Informatica mappings, including SCD Type 2 (Slowly Changing Dimension Type 2).
  • Wrote several complex SQL queries to validate the data transformation rules for ETL testing.
  • Worked extensively in Workflow Manager, Workflow Monitor, and Worklet Designer to create, edit, and run workflows.
  • Involved in the design, development and testing of the PL/SQL stored procedures, packages for the ETL processes.
  • Developed UNIX Shell scripts to automate repetitive database processes and maintained shell scripts for data conversion.
  • Extensively used various data cleansing and data conversion functions in transformations.
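The data cleansing and conversion functions mentioned above can be sketched in Python. This is illustrative only: the field names (name, state, load_date) and rules (trimming, defaulting, date parsing) are assumptions standing in for typical expression-transformation logic.

```python
import datetime

def cleanse(record):
    """Sketch of typical cleansing/conversion rules applied per row:
    trim whitespace, uppercase codes, default missing values, parse dates."""
    out = dict(record)
    # Trim stray whitespace from free-text fields.
    out["name"] = (out.get("name") or "").strip()
    # Standardize codes to uppercase, defaulting missing values.
    out["state"] = (out.get("state") or "NA").strip().upper()
    # Convert YYYYMMDD strings into proper dates.
    raw = out.get("load_date")
    out["load_date"] = (datetime.datetime.strptime(raw, "%Y%m%d").date()
                        if raw else None)
    return out
```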

Environment: Informatica Power Center 9.1/8.6, SQL Server 2008, Oracle 11g/10g, PL/SQL, SAP BO, SSRS, Windows NT, flat files (fixed-width/delimited), MS Excel, UNIX shell scripting, Putty, WinSCP.
