
ETL-Informatica Developer Resume


Charlotte, NC

SUMMARY:

  • Over 7 years of experience in the full software development life cycle (SDLC), including business requirement gathering and analysis, system study, application design, development, testing, implementation, system maintenance, support, and documentation.
  • Data Warehousing: Over 7 years of strong experience in data warehousing and ETL using Informatica PowerCenter 9.1/8.6.1/7.x/6.2/5.1, Informatica PowerExchange for SAP, and Informatica PowerMart 5.1 (Workflow Manager, Workflow Monitor, Warehouse Designer, Source Analyzer, Transformation Developer, Mapplet Designer, Mapping Designer, Repository Manager, and Informatica Server), as well as PowerConnect, PowerPlug, PowerAnalyzer, ETL, ETI, data marts, Talend, OLAP, OLTP, SQL, PL/SQL, complex stored procedures, and triggers.
  • Data Modeling: 6 years of experience in end-to-end implementation of data warehouses, with a strong understanding of data warehouse concepts: ETL, star schema, surrogate keys, and dimensional data modeling using ERwin 4.5/4.0/3.5.5/3.5.2, the Ralph Kimball approach, star/snowflake modeling, data marts, OLAP, fact and dimension tables, physical and logical data modeling, and Oracle Designer.
  • Business Intelligence: 3 years of business intelligence experience using Cognos 7.0/6.0/5.x, with additional experience in OBIEE and SSRS.
  • Databases: Over 7 years of experience using Oracle 10g/9i/8i/8.0/7.0, MySQL, MS SQL Server 2005/2000/7.0/6.0, Sybase 12.x/11.x, MS Access 7.0/2000, Oracle Report Writer, SQR 3.0, SQL, XML, PL/SQL, SQL*Plus, SQL*Loader, and Developer 2000.

TECHNICAL SKILLS:

Data Warehousing: Informatica PowerCenter 9.1/8.6.1/7.x/6.2/5.1, Informatica PowerExchange, Informatica PowerMart 6.2/5.1/4.7 (Workflow Manager, Workflow Monitor, Warehouse Designer, Source Analyzer, Transformation Developer, Mapplet Designer, Mapping Designer, Repository Manager, and Informatica Server), PowerConnect, PowerConnect Striva, PowerPlug, PowerAnalyzer, ETL, COBOL, Siebel 7.0/2000/5.5, Data Mining, data marts, Talend, OLAP, OLTP, SQL*Plus, SQL*Loader

Dimensional Data Modeling: Dimensional Data Modeling, Data Modeling, Star Join Schema Modeling, Snowflake Modeling, FACT and Dimensions Tables, Physical and Logical Data Modeling, ERWIN 4.5/4.0/3.5/3.2, Oracle Designer, Visio and Sybase Power Designer.

OLAP Tools: Cognos, SSRS, OBIEE 10.1.3.

Databases: Oracle 10g/9i/8i/8.0/7.x, MySQL, SQL*Plus, SQL*Loader, MS SQL Server 2000/7.0/6.0, Sybase 12.x/11.x, MS Access.

Programming GUI: C, C++, SQL, PL/SQL, SQL Plus, Transact SQL, ANSI SQL, SQL Scripting, HTML, DHTML, Unix Shell Scripting, PERL Programming

Environment: Sun Solaris 5.5/5.4, HP-UX 10.20/9.0, IBM AIX 5.3/4.3, MS-DOS 6.22, Novell NetWare 4.11/3.61, Windows 3.x/95/98/NT/2000/XP, Red Hat Linux

PROFESSIONAL EXPERIENCE:

Confidential, Charlotte NC

ETL-Informatica Developer

Responsibilities:

  • Involved in end-to-end implementation of data warehousing projects, including business requirements gathering, analysis, system study, preparation of functional and technical specifications, design (logical and physical models), coding, testing, code migration, implementation, system maintenance, support, and documentation.
  • Developed complex mappings using Informatica PowerCenter Designer to transform and load data from various source systems (MySQL, Oracle, SQL Server, and flat files) into Staging, ODS, the Enterprise Data Warehouse, and subject-oriented data marts for the Sales, Marketing, and Finance departments.
  • Developed various Informatica mappings, sessions, and workflows; automated jobs using UNIX scripts and the Tidal job scheduling tool.
  • Developed PL/SQL Stored Programs (Procedures & Functions) to do data transformations and integrated them with Informatica programs.
  • Used session parameters and mapping variables/parameters, and created parameter files to allow flexible workflow runs based on changing variable values.
  • Created UNIX shell scripts for Automation and to invoke the stored procedures for the control process and parameter file generation.
  • Extensively worked on source/target flat files and used FTP, SFTP, and SSH in UNIX scripts.
  • Worked on bug fixes in existing Informatica mappings to produce correct output.
  • Designed ETL procedures to efficiently process large volumes of data.
  • Implemented complex business rules using various complex transformations and Oracle packages, and created custom DML scripts, stored procedures, functions, triggers, and analytical functions.
  • Fine-tuned ETL processes (Informatica mappings, sessions, and workflows) as well as Oracle SQL queries and PL/SQL procedures to improve load efficiency.
  • Identified and analyzed data discrepancies and data quality issues, and worked to ensure data consistency and integrity. Performed audits on ETLs and validated source data against target table loads.
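The parameter-file generation described above can be sketched in a small shell script. This is a minimal illustration only; the folder, workflow, session, and parameter names below are hypothetical examples, not taken from the actual project.

```shell
#!/bin/sh
# Sketch: generate an Informatica parameter file before a workflow run.
# Folder (SALES_DW), workflow (wf_load_sales), and session names are
# hypothetical placeholders.

PARAM_FILE="wf_load_sales.param"
LOAD_DATE=$(date +%Y-%m-%d)

cat > "$PARAM_FILE" <<EOF
[SALES_DW.WF:wf_load_sales.ST:s_m_load_sales]
\$\$LOAD_DATE=$LOAD_DATE
\$\$SRC_DIR=/data/incoming/sales
\$DBConnection_Target=ORA_DW_TGT
EOF

echo "Wrote $PARAM_FILE"
```

Regenerating this file from the scheduler before each run is what lets the same workflow process a different load date or source directory without any change to the mappings themselves.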

Environment: Informatica Power Center 9.1/8.6.1, Talend, Oracle 10g/9i, SQL Server 2005, SSRS, MySQL, SQL*Plus, PL/SQL, TOAD, Erwin 4.0, Windows NT, WinSCP, Quality Center 9.0.

Confidential, CA

ETL-Informatica Developer

Responsibilities:

  • Designed the ETL process, load strategy, and requirements specification after gathering requirements from end users.
  • Responsible for creating, importing all the required sources and targets to the shared folder.
  • Designed and developed Informatica mappings to load source data into target systems.
  • Configured sessions, setup workflow and tasks to schedule data loads at desired frequency.
  • Maintained release-related documents using configuration management techniques.
  • Designed and Created data cleansing and validation scripts using Informatica ETL tool.
  • Provided resolution to issues by correcting and validating specification changes.
  • Designed the algorithm to implement Type II slowly changing dimensions.
  • Created Folders and placed all the tested ETL, Oracle and UNIX scripts in the Staging path for production movement.
  • Created UNIX scripts to FTP source files, execute them in sequence by timestamp, and archive processed files for future reference.
  • Extensively worked on data extraction, transformation, and loading from source to target systems using Teradata BTEQ, FastLoad, and MultiLoad.
  • Wrote BTEQ, FastLoad, and MultiLoad scripts to load data into the target data warehouse.
  • Performed error handling and performance tuning of Teradata queries and utilities.
  • Performed data reconciliation across the various source systems and Teradata.
  • Implemented table partitioning in the database.
  • Developed dashboards and dashboard pages.
  • Created reports such as table, chart, and pivot, and published them to dashboards using Answers & Dashboards.
  • Designed, created, and tested reports in the dashboard, and built ad hoc reports according to client needs.
  • Designed, Developed and Tested Data security and Dashboard security in OBIEE.
  • Involved in modifying existing reports based on requirements.
  • Created prompts and filters.
  • Involved in creation of the RPD (OBIEE repository).
  • Created Dimensional Hierarchies in Logical Layer to achieve Drill down functionality.
  • Prepared the data flow across all Informatica objects created to support integration testing.
  • Scheduled the jobs in Dev and Testing environment using Informatica Scheduler.
  • Prepared validation scripts for user acceptance testing.
  • Performed ETL component tuning.
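The archiving step mentioned above (moving processed source files aside, stamped for future reference) can be sketched roughly as follows. Directory and file names here are hypothetical, used only to show the pattern:

```shell
#!/bin/sh
# Sketch: archive processed source files with a timestamp suffix so they
# can be traced later. Paths below are hypothetical examples.

SRC_DIR="./incoming"
ARCHIVE_DIR="./archive"
STAMP=$(date +%Y%m%d_%H%M%S)

mkdir -p "$SRC_DIR" "$ARCHIVE_DIR"
# Create one sample processed file so the sketch is self-contained.
echo "sample data" > "$SRC_DIR/customers.dat"

# Move each processed .dat file to the archive, appending the timestamp.
for f in "$SRC_DIR"/*.dat; do
    [ -e "$f" ] || continue            # skip if the glob matched nothing
    base=$(basename "$f" .dat)
    mv "$f" "$ARCHIVE_DIR/${base}_${STAMP}.dat"
done
```

In practice such a script would run as the last step of the load, after the session completes successfully, so a failed run leaves its input in place for a rerun.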

Environment: Informatica Power Center 8.6/7.1 (Source Analyzer, Warehouse Designer, Mapping Designer, Transformation Developer, Workflow Manager), Tidal 5.3.1, TOAD 9.1, Oracle 10g/9i, MS SQL Server, PL/SQL, OBIEE, Teradata, Flat Files, Windows NT, UNIX, HP Quality Center 9.0.

Confidential

ETL Developer

Responsibilities:

  • Involved in developing mappings from flat file sources to flat file targets.
  • Implemented lookup logic on flat files, which Informatica does not support directly, using Joiner and Expression transformations.
  • Used the Normalizer transformation to create multiple target records from source fields coming from different sources and terminating in the same target.
  • Resolved lookups on a single file under different conditions using multiple Filter and Joiner transformations.
  • Read flat files both directly and indirectly (via file lists), interchangeably.
  • Handled situations where the targets of previous mappings served as file-list sources for subsequent mappings, using Decision tasks.
  • Fine-tuned mappings, handled data-flow violations, and eliminated unnecessary data flow within the mappings.
  • Performed weekly and monthly loads on frequently changing source files.
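The indirect (file-list) reading mentioned above works by pointing the session at a plain text file whose lines are the paths of the actual flat files to read. A minimal sketch of building such a list, with hypothetical file names:

```shell
#!/bin/sh
# Sketch: build an Informatica indirect-source file list -- a text file
# listing the flat files a session should read. Names are hypothetical.

DATA_DIR="./weekly_feed"
FILE_LIST="filelist_weekly.txt"

mkdir -p "$DATA_DIR"
# Two sample source files standing in for a weekly feed.
echo "rec1" > "$DATA_DIR/sales_20240101.dat"
echo "rec2" > "$DATA_DIR/sales_20240108.dat"

# List the matching flat files in name (timestamp) order into the list;
# the session would then read it with its source file type set to Indirect.
ls "$DATA_DIR"/sales_*.dat | sort > "$FILE_LIST"
```

Switching a session between this indirect mode and a direct single-file source is what allows the same mapping to serve both the weekly multi-file feed and ad hoc single-file reruns.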

Environment: Windows 2000, Informatica Power Center 6.2, Oracle 8i.
