ETL Developer Resume

SUMMARY

  • Seven plus years of experience in the IT industry, including six plus years in data warehousing, designing and implementing Data Warehouse applications using the ETL tool Informatica PowerCenter 10.x, 9.5.1, 9.1.1, 8.x and 7.x.
  • Worked across multiple domains, including government, finance and healthcare.
  • Efficient in creating source and target databases and developing strategies for Extraction, Transformation and Loading (ETL) using Informatica PowerCenter Designer, Repository Manager, Workflow Manager, Workflow Monitor and Admin Console.
  • Experience with all Designer components, including Source Analyzer, Target Designer, Transformation Developer, Mapping Designer and Mapplet Designer.
  • Worked with complex mappings using transformations such as Source Qualifier, Expression, Filter, Joiner, Router, Update Strategy, Union, Rank, Normalizer, Unconnected/Connected Lookup, Java, Sorter, Sequence Generator and Aggregator.
  • Experience in implementing Star and Snowflake schemas.
  • Designed source-to-target maps and handled code migration, version control, scheduling tools, auditing, shared folders, data movement and naming in accordance with ETL best practices, standards and procedures.
  • Focused on fixed-width and delimited flat files; also worked with direct and indirect flat files.
  • Worked with heterogeneous database systems such as Oracle, DB2, Teradata and MS SQL Server, as well as flat files, XML documents, COBOL files, CSV, Excel and legacy systems.
  • Proficient in dimensional modeling using Star and Snowflake schemas, identifying fact and dimension tables, data profiling, data cleansing and data staging of operational sources using ETL processes, surrogate keys and normalization/denormalization, providing data mining features for data warehouses.
  • Solid understanding of relational database system structure, Oracle Database architecture and design.
  • Designed and implemented Slowly Changing Dimensions (SCD) Types 1, 2 and 3 and Change Data Capture (CDC); a sketch of the SCD Type 2 pattern follows this list.
  • Experience with system and user-defined variables and error handling using event handlers.
  • Extensive experience implementing data cleanup procedures, transformation scripts, triggers and stored procedures, and executing test plans for loading data successfully into the targets.
  • Comfortable tuning SQL queries for elapsed time and CPU consumption using clustered and non-clustered indexes.
  • Created indexes to enhance data-access performance.
  • Knowledge of Oracle utilities such as SQL*Loader.
  • Good knowledge of writing UNIX shell scripts and Windows batch scripts for scheduling workflow sessions and parsing files.
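
A minimal sketch of the SCD Type 2 pattern mentioned above: in the projects below this logic was built with Informatica Lookup and Update Strategy transformations, but the equivalent hand-written Oracle SQL looks roughly like this. The DIM_CUSTOMER and STG_CUSTOMER tables, their columns and the DIM_CUSTOMER_SEQ sequence are hypothetical names used for illustration.

    -- Expire the current dimension row when a tracked attribute changes.
    UPDATE dim_customer d
       SET d.eff_end_date = TRUNC(SYSDATE) - 1,
           d.current_flag = 'N'
     WHERE d.current_flag = 'Y'
       AND EXISTS (SELECT 1
                     FROM stg_customer s
                    WHERE s.customer_id = d.customer_id
                      AND (s.customer_name <> d.customer_name
                           OR s.address <> d.address));

    -- Insert a new current version for changed or brand-new customers;
    -- dim_customer_seq is an assumed surrogate-key sequence.
    INSERT INTO dim_customer
          (customer_key, customer_id, customer_name, address,
           eff_start_date, eff_end_date, current_flag)
    SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.customer_name, s.address,
           TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
      FROM stg_customer s
     WHERE NOT EXISTS (SELECT 1
                         FROM dim_customer d
                        WHERE d.customer_id = s.customer_id
                          AND d.current_flag = 'Y');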

TECHNICAL SKILLS

ETL Tools: Informatica PowerCenter 10.x, 9.5.1, 9.1.1, 8.x, 7.x

Databases: Oracle 11g/10g, DB2, MS SQL Server 2012/2008, Teradata V13, Netezza

Tools and Utilities: TOAD, WLM, AutoSys, PC-based version control, FastLoad, MultiLoad, FastExport, TPump, AQT, SQL*Plus, SQL*Loader, SQL Developer, PuTTY

Programming & Scripting: C, C++, UNIX shell scripting, Java, MS SQL, PL/SQL, Perl scripting, Windows batch scripting

Operating Systems: Windows Server 2008, UNIX, LINUX

Methodologies: Star schema, Snowflake schema

Web Technologies: HTML, XML

PROFESSIONAL EXPERIENCE

Confidential

ETL Developer

Responsibilities:

  • Worked with business analysts and data management teams on requirements gathering and business analysis, translating business requirements into technical specifications to build the enterprise data warehouse.
  • Designed database table structures for transactional and reference data sources.
  • Performed source data analysis and data profiling for data warehouse projects.
  • Designed and implemented all stages of the data life cycle; maintained coding and naming standards.
  • Developed a Star schema as per business requirements.
  • Performed data analysis for source and target systems, applying data warehousing concepts: staging tables, dimensions, facts, and Star and Snowflake schemas.
  • Extracted data from heterogeneous sources, including Oracle, SQL Server and fixed-width and delimited flat files, and transformed it into a harmonized data store under stringent business deadlines.
  • Used Informatica PowerCenter features such as email notifications, scripts and variables in the ETL process.
  • Developed Informatica PowerCenter mappings to load data from both fixed-width and delimited flat files and from an Oracle 11g relational database.
  • Extensively used transformations such as Source Qualifier, Expression, Filter, Aggregator, Joiner, Lookup, Sequence Generator, Router, Sorter, Stored Procedure and Java.
  • Used the Debugger to test mappings and fix bugs.
  • Involved in fixing invalid mappings, performance tuning, and testing of stored procedures, functions, Informatica sessions, batches and target data.
  • Developed a reusable audit-tracking design for capturing file load time, file name and load statistics (a simplified sketch follows this list).
  • Developed web service calls for data unavailable in the lookup tables.
  • Developed Java code supporting those web services.
  • Developed Slowly Changing Dimension (SCD) Type 2 logic for loading data into dimensions and facts.
  • Performed unit testing, system testing and data validation for the developed Informatica mappings.
  • Worked extensively on data cleansing, data analysis and monitoring using data quality tools.
  • Carried out defect analysis and fixed reported bugs.
  • Implemented exception/error handling to catch unusual data from the source.
  • Created test cases and captured unit test results.
  • Developed and tested all Informatica mappings, sessions and workflows, involving several tasks.
  • Conducted and participated in process improvement and performance tuning discussions, recommending possible outcomes with a focus on production application stability and enhancements.
  • Parameterized the developed ETL and automated the loading process using UNIX shell scripts.
  • Used the Tidal scheduling tool to schedule workflows and batch loads.
  • Provided production support on a rotation basis.
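
A simplified sketch of the reusable audit-tracking design mentioned above. In practice this was a reusable Informatica component writing session statistics to an audit table; the ETL_FILE_AUDIT table, its sequence and all column and file names here are hypothetical.

    -- Assumed surrogate-key sequence and audit table, one row per file load.
    CREATE SEQUENCE etl_file_audit_seq;

    CREATE TABLE etl_file_audit (
        audit_id      NUMBER        PRIMARY KEY,
        file_name     VARCHAR2(255) NOT NULL,
        workflow_name VARCHAR2(100),
        load_start    TIMESTAMP,
        load_end      TIMESTAMP,
        rows_read     NUMBER,
        rows_loaded   NUMBER,
        rows_rejected NUMBER,
        load_status   VARCHAR2(20)   -- e.g. 'SUCCESS' or 'FAILED'
    );

    -- Each load writes its statistics on completion.
    INSERT INTO etl_file_audit
          (audit_id, file_name, workflow_name, load_start, load_end,
           rows_read, rows_loaded, rows_rejected, load_status)
    VALUES (etl_file_audit_seq.NEXTVAL, 'claims_20160101.dat', 'wf_load_claims',
            SYSTIMESTAMP - INTERVAL '5' MINUTE, SYSTIMESTAMP,
            10000, 9950, 50, 'SUCCESS');
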
Environment: Informatica PowerCenter 9.6.1, B2B DT/DX, Data Transformation Studio, HL7Spy, Oracle 11g, SQL Server 2000, SSRS, SQL*Plus, SQL Developer, UNIX, WinSCP, PuTTY, AutoSys, Tidal, Automic, OBIEE.

Confidential

ETL Informatica Developer

Responsibilities:

  • Hands-on OBIEE reporting.
  • Hands-on experience with Teradata SQL Assistant.
  • Followed the complete development cycle from estimation to deployment, working iteratively via Agile methodologies.
  • Performed root cause analysis, reporting and closure of production issues within the predefined SLAs.
  • Extensively used all Informatica client tools and transformations such as Lookup, Filter, Expression (reusable), Router, Rank, Joiner, Update Strategy, Aggregator, Stored Procedure, Sorter, Sequence Generator and Normalizer.
  • Designed Informatica mapplets and worklets to take advantage of Informatica's reusability features.
  • Used the Update Strategy transformation to effectively load data from source to target.
  • Knowledge of Kimball and Inmon methodologies.
  • Knowledge of versioning tools and scheduling tools such as AutoSys.
  • Comfortable tuning SQL queries for run time, typically through indexing (an illustrative example follows this list).
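
As an illustration of the index-driven SQL tuning mentioned above and in the summary, a minimal Oracle example; FACT_ORDERS, its columns and the index name are hypothetical.

    -- Without an index, this lookup full-scans the fact table.
    -- A composite index on the filter columns lets the optimizer
    -- use an index range scan instead.
    CREATE INDEX ix_fact_orders_cust_date
        ON fact_orders (customer_key, order_date);

    -- Verify the new access path.
    EXPLAIN PLAN FOR
    SELECT order_id, order_amount
      FROM fact_orders
     WHERE customer_key = :cust_key
       AND order_date >= DATE '2015-01-01';

    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);
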
Environment: Informatica PowerCenter 9.5.1, Oracle 11g, UNIX, SQL Server 2008, XML, TOAD, PL/SQL, ClearCase, BTEQ scripts, WLM, Teradata V13, Windows Server 2012, Teradata utilities.
