
Informatica ETL Developer Resume


Cary, NC

SUMMARY

  • Experienced in designing and developing efficient ETL solutions to load huge volumes of data to/from flat files, XML files and relational databases such as Teradata, Oracle, SQL Server & Netezza.
  • Proficient in gathering the Business requirements and translating them to corresponding technical requirements and strategies.
  • Extensive working experience in the Analysis, Design, Development, Testing and Implementation phases of various Data Warehousing applications. Worked in both Waterfall & Agile methodologies including SCRUM.
  • Sound knowledge in Data warehousing concepts like OLTP vs OLAP, Dimensional Data Modelling, E-R Modelling, Slowly Changing Dimensions, Database Normalization/De-normalization etc.
  • Developed many complex mappings using various transformations such as Expression, Filter, Look-up, Joiner, Router, Aggregator, Stored Procedure, Normalizer, Transaction Control, Custom, HTTP and XML.
  • Good knowledge of Informatica architecture, including nodes, domains & services; also, working experience with the Admin Console.
  • Well-versed with ETL Informatica performance tuning process involving bottleneck identifications, analyzing thread statistics, optimizing components and using parallel partitions.
  • Good knowledge of Teradata Architecture and proficient in creating various database objects such as Tables, Indexes, Views, Triggers, Macros, Stored Procedures etc.
  • Hands-on experience with Teradata utilities such as BTEQ, FastExport, FastLoad, MultiLoad, TPT & TPump.
  • Performed tuning and optimization of complex SQL queries using Teradata Explain.
  • Extensive experience in writing complex Oracle PL/SQL Packages, Stored procedures, Functions, Cursors and Triggers. Good knowledge of key Oracle performance related features such as Stats Gathering, Explain Plans, Hints etc.
  • Expertise in T-SQL and good working knowledge of MS SQL Server suite applications such as SSIS & SSRS.
  • Hands-on experience with Data Modelling tool ERwin to design the Logical and Physical models for the OLTP and OLAP applications.
  • Proficient in writing efficient Unix Shell scripts. Developed various shell scripts to automate manual tasks & generate batch status reports.
  • Experienced in scheduling tools Autosys & Control-M to create/modify batch jobs, enable alerts and set up the calendars.
  • Designed & developed various macros using Microsoft Excel VBA to validate, import and export the Informatica objects to/from the repositories.
  • Exposure to the reporting tools Business Objects, Tableau & SSRS and the financial-market compliance tools Mantas & Actimize.
  • Good understanding of the various Financial services like Investment Banking, Wealth Management, Asset Management, Private Banking, Capital Markets etc.
  • Worked with Informatica Data Quality (IDQ) 9.6.1 for data cleansing, data matching and data conversion.
  • Designed/developed reusable IDQ mappings to match accounting data based on demographic information.
  • Used IDQ to improve data quality: cleansing data, removing unwanted records and verifying correctness.
  • Worked on IDQ parsing, standardization, matching and web services.
  • Imported the mappings developed in Data Quality (IDQ) into Informatica Designer.
  • Proficient in debugging and trouble-shooting of any technical and performance issues.
  • Excellent written and verbal communication skills.
  • Team player with good problem-solving and analytical skills. Enthusiastic to learn new tools and technologies.
  • Ability to multi-task and lead a team in a fast-paced environment.
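
As an illustration of the batch status reporting mentioned above, here is a minimal shell sketch; the log directory layout and the SUCCESS/FAILURE last-line convention are illustrative assumptions, not the original setup:

```shell
#!/bin/sh
# Sketch: summarize batch job outcomes from per-job log files.
# Assumes logs named <job>.log whose last line is SUCCESS or FAILURE.
LOG_DIR=${1:-./logs}
REPORT=./batch_report.txt

# Illustration only: seed two sample job logs if none exist.
mkdir -p "$LOG_DIR"
[ -f "$LOG_DIR/load_deposits.log" ] || printf 'SUCCESS\n' > "$LOG_DIR/load_deposits.log"
[ -f "$LOG_DIR/load_trades.log" ]   || printf 'FAILURE\n' > "$LOG_DIR/load_trades.log"

ok=0; bad=0
: > "$REPORT"
for f in "$LOG_DIR"/*.log; do
  job=$(basename "$f" .log)
  status=$(tail -1 "$f")            # last line holds the final job status
  printf '%-20s %s\n' "$job" "$status" >> "$REPORT"
  if [ "$status" = "SUCCESS" ]; then ok=$((ok+1)); else bad=$((bad+1)); fi
done
printf 'Total: %d succeeded, %d failed\n' "$ok" "$bad" >> "$REPORT"
cat "$REPORT"
```

A script like this can be invoked at the end of a batch cycle to mail out the status summary.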

TECHNICAL SKILLS

ETL Tools: Informatica Power Center 9.x/8.x/7.x, SQL Server Integration Services (SSIS)

DQ Tools: Informatica Data Quality 9.x

Databases: Teradata 12/13/14, Oracle 11g/10g/9i/8i, SQL Server 2012/2008/2005, Netezza 7.x

Database Tools: Teradata SQL Assistant, SQL Developer, TOAD, SSMS

Reporting Tools: Business Objects XI R2, Tableau 7/8, SSRS

Data Modelling Tools: ERwin 7.x

Languages: C, C++, Java, C#, Perl, HTML, XML, SQL, T-SQL, PL/SQL, Shell Scripting, Excel VBA

Operating Systems: Windows 7/Vista/XP/2003/2000/NT, UNIX, AIX, Red Hat Linux & Sun Solaris

PROFESSIONAL EXPERIENCE

Confidential - Cary, NC

Informatica ETL Developer

Responsibilities:

  • Created various Informatica mappings to validate the transactional data against Business rules, extract look up values and enrich the data as per the mapping documents.
  • Developed various Informatica Workflows to load the data from various upstream systems using different methodologies such as trigger-based pull, direct pull & file-based push.
  • Designed the ETL architecture for the Deposits product to process huge volumes of Deposits data on a daily basis.
  • Fine-tuned several long running Informatica workflows and implemented various techniques for the faster processing of high volume data by creating parallel partitions and using Teradata Fast Export and Netezza Bulk Writer.
  • Developed various SQL queries using joins, sub-queries & analytic functions to pull the data from various relational DBs such as Oracle, Teradata & SQL Server.
  • Created complex Datamart views for the corresponding products.
  • Created various complex PL/SQL stored procedures to manipulate/reconcile the data and generate the dashboard reports.
  • Performed Unit Testing & prepared the deployment plan for the various objects by analyzing their interdependencies.
  • Developed several UNIX shell scripts for the files Archival & Compression.
  • Created various Autosys jobs for the scheduling of the underlying ETL flows.
  • Co-ordinated with various team members across the globe, including Application teams, Business Analysts, Users, DBAs and the Infrastructure team, to resolve any technical and functional issues in UAT and PROD.
  • Created various technical documents required for the knowledge transition of the application which includes re-usable objects (Informatica & Unix).
  • Worked on IDQ for data cleansing, data matching, data conversion and address standardization.
  • Integrated workflow changes to both test and enable error handling using Informatica IDQ.
  • Created Data Objects, Quick Profiles, Custom Profiles and Drill-Down on Profile Results using IDQ.
  • Created reference tables from profile columns using IDQ.
  • Handled all Hadoop environment builds, including design, capacity planning, cluster setup, performance tuning and ongoing monitoring.
  • Loaded data from large data files into Hive tables.
  • Imported and exported data to/from HDFS and Hive using Sqoop.
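
The file archival & compression scripting above can be sketched roughly as follows; the directory names, file pattern and retention period are illustrative assumptions:

```shell
#!/bin/sh
# Sketch: archive and compress processed feed files older than a retention period.
SRC=${1:-./inbound}
ARCHIVE=${2:-./archive}
RETENTION_DAYS=${3:-7}

mkdir -p "$SRC" "$ARCHIVE"
# Illustration only: create one sample feed file and back-date it.
touch "$SRC/deposits_20240101.dat"
touch -d '10 days ago' "$SRC/deposits_20240101.dat" 2>/dev/null || \
  touch -t 202001010000 "$SRC/deposits_20240101.dat"

# Move feed files older than the retention period and gzip them in place.
find "$SRC" -name '*.dat' -mtime +"$RETENTION_DAYS" | while read -r f; do
  mv "$f" "$ARCHIVE/" && gzip -f "$ARCHIVE/$(basename "$f")"
done
ls "$ARCHIVE"
```

Such a script is typically scheduled (e.g. via Autosys) to run after the daily load completes.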

Environment: Informatica Power Center 9.6.1, IDQ, Oracle 11g, SQL Server 2012, MS Access 2010, SQL*Loader, UNIX, WinSCP, PuTTY, ERwin 7.2, SQL, PL/SQL

Confidential, Bluffton, SC

Informatica ETL Developer

Responsibilities:

  • Involved in gathering and reviewing business requirements. Involved in designing the specifications, Design Documents, Data modeling and design of data warehouse.
  • Responsible for definition, development and testing of processes/programs necessary to extract data from operational databases, Transform and cleanse data, and Load it into data warehouse using Informatica Power center.
  • Created the repository manager, users, user groups and their access profiles.
  • Created complex mappings in Power Center Designer using Expression, Filter, Sequence Generator, Update Strategy, Joiner and Stored procedure transformations.
  • Created connected and unconnected Lookup transformations to look up the data from the source and target tables.
  • Implemented CDC for mappings so as to capture the changes and preserve history.
  • Modeling and populating the business rules using mappings into the Repository for Meta Data management.
  • Extensively worked on migrations and Conversions with legacy systems data in SQR Production Reporting.
  • Demonstrated expertise utilizing ETL tools, including Informatica and ETL package design, and RDBMS systems like SQL Server, Oracle.
  • Cleansed the source data, extracted and transformed data with business rules, and built reusable mappings, known as ‘Mapplets’ using Informatica Designer.
  • Extensively performed Data Masking for preserving the referential integrity of the user data.
  • Involved in the development of Informatica mappings and also performed tuning for better performance.
  • Extensively worked on tuning (Both Database and Informatica side) and thereby improving the load time.
  • Developed ETL’s for Data Extraction, Data Mapping and data Conversion using SQL, PL/SQL and various ETL scripts.
  • Automated the entire processes using UNIX shell scripts.
  • Conducted status meetings with project managers, escalated issues when necessary and led issue-resolution meetings.
  • Executed test scripts including pre-requisites, detailed instructions and anticipated results.
  • Executed back-end, data-driven test cases.
  • Performed RCA (Root Cause Analysis) of failed test cases and provided the related data samples to the client.
  • Reviewed test plans for team members to ensure high-quality deliverables.
  • Incorporated UAT test cases into testing.
  • Identified bugs/issues/problems and provided solutions for potential issues based on knowledge acquired on the application.
  • Used IDQ to improve data quality: cleansing data, removing unwanted records and verifying correctness.
  • Worked on IDQ parsing, standardization, matching and web services.
  • Imported the mappings developed in Data Quality (IDQ) into Informatica Designer.

Environment: Informatica Power Center 8.6.1/8.1.3, IDQ, Oracle 10g, SQR, OBIEE, ODI, ETL, SQL Assistant and Administrator, XML, UNIX Shell Scripting.

Confidential, Chantilly, VA

Informatica ETL Developer

Responsibilities:

  • Designed and Developed mappings using different transformations like Source Qualifier, Expression, Lookup (Connected & Unconnected), Aggregator, Router, Rank, Filter and Sequence Generator.
  • Created Update Strategy and Stored Procedure transformations to populate targets based on business requirements.
  • Responsible for definition, development and testing of processes/programs necessary to extract data from operational databases, Transform and cleanse data, and Load it into data warehouse using Informatica Power center.
  • Implemented CDC for mappings so as to capture the changes and preserve history.
  • Extensively used PL/SQL programming procedures, packages to implement business rules.
  • Created and Configured Workflows, Worklets and Sessions to transport the data to target warehouse tables using Workflow Manager.
  • Optimized the performance of the mappings by various tests on sources, targets and transformations. Developed Procedures and Functions in PL/SQL for ETL.
  • Extensively used ETL to load data from source systems like Flat Files and Excel Files into staging tables and load the data into the target database Oracle.
  • Extensively dealt with the performance issues and made necessary coding changes for improving the system Performance.
  • Converted the data mart from Logical design to Physical design, defined data types, Constraints, Indexes, generated Schema in the Database, created Automated scripts, defined storage parameters for the objects in the Database.
  • Used Update Strategy and target load plans to load data into Type 1/Type 2 Dimensions.
  • Improved performance by identifying the bottlenecks in Source, Target, Mapping and Session levels.
  • Designed and developed pre-session, post-session routines and batch execution routines.
  • Worked with XSD and XML files (sources & targets) to read & parse the data and load it into the various target systems.
  • Created various Oracle objects (tables, views, synonyms, indexes, partitions, functions, triggers etc.).
  • Developed various PL/SQL procedures using cursors, records, collections & dynamic SQL.

Environment: Informatica Power Center 9.1/9.5, IDQ, Oracle 10g, SQL Server 2005/2000, SQR, XML, Windows, UNIX and PL/SQL Developer

Confidential

Informatica ETL Developer

Responsibilities:

  • Worked with IDQ to analyze the transactional data for the Products and create scorecards for the same.
  • Developed various Informatica mappings & mapplets to read the data from various source systems, validate the data against the complex business rules and load the data into the Global Ledger.
  • Designed and developed the ETL logic to process multiple files for a single product using Indirect loading and generating multiple files for the same product using Transaction Control transformation.
  • Developed the Teradata Metadata entries, indexes and partitions for the various feeds sourced into the Global Ledger.
  • Worked on the Teradata Utilities BTEQ, Fast Export, Fast Load and Multi Load to extract and load huge volumes of data into the GGL.
  • Created a complex view to look up the static data for the various feeds from the database table maintained by Business Users.
  • Implemented the concurrent workflow option to execute a large number of workflows concurrently.
  • Fine-tuned various long-running feeds by implementing Informatica partitions, splitting the Look-up cache, using Persistent caches and enabling the Session on Grid (SonG) feature.
  • Designed and developed Unix Shell scripts to generate the run time parameters for the underlying ETL Informatica workflows from the DB Metadata entries.
  • Performed unit testing and provided support to the users for defects raised in the UAT environment.
  • Performed peer reviews of the code and prepared the deployment plan after discussing with multiple stakeholders.
  • Co-ordinated with Business users, DBA teams & other stakeholders on any issues identified in UAT and worked on their resolution.
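
The run-time parameter generation described above can be sketched as follows; here a flat file stands in for the DB metadata entries, and all feed names, paths and workflow/session names are illustrative assumptions:

```shell
#!/bin/sh
# Sketch: build an Informatica parameter file for a feed from metadata entries.
FEED=${1:-DEPOSITS}
RUN_DATE=$(date +%Y%m%d)
PARAM_FILE="./param_${FEED}.txt"

# Stand-in metadata (illustration only): feed|source_dir|target_table
cat > ./feed_metadata.txt <<'EOF'
DEPOSITS|/data/inbound/deposits|GL_DEPOSITS
TRADES|/data/inbound/trades|GL_TRADES
EOF

line=$(grep "^${FEED}|" ./feed_metadata.txt) || { echo "unknown feed: $FEED"; exit 1; }
src_dir=$(echo "$line" | cut -d'|' -f2)
tgt_tbl=$(echo "$line" | cut -d'|' -f3)

# Emit the parameter file; $$ names are Informatica mapping parameters.
cat > "$PARAM_FILE" <<EOF
[Global]
\$\$RUN_DATE=$RUN_DATE
[WF_LOAD_${FEED}.ST_M_LOAD_${FEED}]
\$\$SRC_DIR=$src_dir
\$\$TGT_TABLE=$tgt_tbl
EOF
cat "$PARAM_FILE"
```

The generated file can then be passed to the workflow via pmcmd's parameter-file option.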

Environment: Informatica Power Center 8.6, ETL, Oracle 10g/9i, DB2, PL/SQL, TOAD, SQL* Plus, SQL*Loader, SQL Server 2000, Windows Server 2000.
