
Informatica/Hadoop Developer Resume


San Jose, CA

PROFESSIONAL SUMMARY:

  • Overall 7+ years of experience in the IT industry with Data Warehousing, OLAP reporting tools, and ETL tools, using industry best methodologies and procedures, including Data Governance, Data Integration, and Data Quality Assurance.
  • Expert knowledge in working with Data Warehousing ETL using Informatica Power Center 10.x/9.x/8.x (Designer, Repository Manager, Repository Server Administrator Console, Server Manager, Workflow Manager, Workflow Monitor).
  • Good knowledge of data warehouse concepts and principles (Kimball/Inmon) - Star Schema, Snowflake, Enterprise Data Vault, SCDs, surrogate keys, normalization/denormalization, and data marts.
  • Sound background in Hadoop concepts and methodologies, with demonstrated expertise in applying this knowledge to building solutions.
  • Experience working in Hadoop: created table structures in the Hive environment and loaded data into them.
  • Experience in integrating various data sources with relational databases like Oracle, SQL Server, and DB2, and worked on integrating data from flat files.
  • Extensive experience in using various Informatica Designer Tools like Source Analyzer, Mapping Designer, Transformation Developer, Mapplet Designer.
  • Extensively worked on Informatica Power Center Transformations such as Source Qualifier, Lookup, Filter, Expression, Router, Joiner, Update Strategy, Rank, Aggregator, Stored Procedure, Sorter, Sequence Generator, Normalizer, Union, and XML Source Qualifier.
  • Highly experienced in developing, designing, reviewing and documenting Informatica work products like Mappings, Mapplet, Shortcuts, Reusable transformations, Sessions, Workflows, Worklets, Schedulers and experienced in using Mapping parameters, Mapping variables, Session parameter files.
  • Good proficiency in Informatica Data Quality.
  • Created mappings in Informatica Data Quality (IDQ) using Parser, Standardizer, and Address Validator transformations.
  • Experienced in creating IDQ mappings using Labeler, Standardizer, and Address Validator transformations with Informatica Developer, and migrated them to Informatica Power Center.
  • Profiled the data and performed a proof of concept for Informatica Data Quality (IDQ).
  • Extensive experience in error handling and problem fixing using Informatica.
  • Extensively worked on data extraction, Transformation and loading data from various sources like Oracle, SQL Server, Teradata and files like Flat files.
  • Hands-on experience in tuning mappings and identifying and resolving performance bottlenecks at various levels: sources, targets, transformations, mappings, and sessions.
  • Expertise in implementing complex business rules by creating complex mappings/mapplets, shortcuts, reusable transformations, and partitioned sessions.
  • Involved in Implementing Slowly Changing Dimensions, Star Schema modeling, Snowflake modeling, FACT tables, Dimension tables, De normalization.
  • Exposure to Informatica B2B Data Exchange, which supports an expanding diversity of customers and partners and their data with capabilities that surpass typical B2B solutions.
  • Exposure to Informatica B2B Data Transformation, which supports transformation of structured, unstructured, and semi-structured data types while complying with the multiple standards that govern data formats.
  • Experience in Oracle 9i/10g/11g, SQL Server, Teradata, SQL*Loader, DB2, etc.
  • Extensive Experience in UNIX (AIX/Solaris/ HP-UX 11.x), and Windows Operating system.
  • Extensively used SQL and PL/SQL to write Stored Procedures, Functions, and Packages.
  • Developed excellent overall software development life cycle (SDLC) skills by working independently and as a team member to analyze functional/business requirements and to prepare test plans and test scripts. Collaborated with onsite teams and managed interactions with various offshore teams.
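The Slowly Changing Dimension work mentioned above can be illustrated outside Informatica. Below is a minimal Type 2 sketch in Python; the table, key, and column names (`cust_id`, `city`, the date/flag columns) are hypothetical, not from any actual project schema:

```python
from datetime import date

def apply_scd2(dimension, incoming, today):
    """SCD Type 2 sketch: when a tracked attribute changes, expire the
    current row (set flag false, stamp expiry date) and insert a new
    version of the row. Column names are illustrative only."""
    for row in dimension:
        if row["cust_id"] == incoming["cust_id"] and row["is_current"]:
            if row["city"] == incoming["city"]:
                return dimension  # no change, keep history as-is
            row["is_current"] = False
            row["expiry_date"] = today
    dimension.append({
        "cust_id": incoming["cust_id"],
        "city": incoming["city"],
        "effective_date": today,
        "expiry_date": None,
        "is_current": True,
    })
    return dimension

dim = [{"cust_id": 1, "city": "San Jose", "effective_date": date(2015, 1, 1),
        "expiry_date": None, "is_current": True}]
dim = apply_scd2(dim, {"cust_id": 1, "city": "Fremont"}, date(2016, 6, 1))
```

In a PowerCenter mapping the same decision is typically made with a Lookup plus an Update Strategy transformation; this sketch only shows the history-preserving logic itself.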

TECHNICAL SKILLS:

ETL Tools: Informatica PowerCenter 10.1/9.6.1/9.1/8.x, IDQ, Informatica MDM Hub 9.5/9.7/10.

Databases: Oracle 11g/10g/9i/8i, SQL Server 2000/2005/2008, Teradata, DB2, Amazon Redshift, and Hive.

Languages: SQL, T-SQL, PL/SQL, UNIX Shell Scripting, Windows Batch Scripting, HiveQL, PostgreSQL.

Operating Systems: UNIX, Windows 2000/2003/NT/XP, Linux, Sun Solaris, HP-UX, IBM AIX.

Business Intelligence Tools: COGNOS, Business Objects (BO).

DB Tools: TOAD, SQL*Plus, PL/SQL Developer, SQL*Loader, FastLoad, MultiLoad, SQL Workbench.

Web Technologies: HTML, XML, Java Script

Data Modeling Tools: Erwin, Embarcadero Studio

Other Tools: HP Quality Center, WinSCP, PuTTY, JIRA, MS Office, Autosys, Tidal, Rally, and Jupyter Notebook.

PROFESSIONAL EXPERIENCE:

Confidential, San Jose, CA

Informatica/Hadoop Developer

Responsibilities:

  • Design, develop, maintain, monitor and execute production ETL processes using Informatica Power Center.
  • Developed required mappings and tested data flows.
  • Made required production changes and helped resolve production issues.
  • Monitored and executed history and incremental table loads.
  • Loaded data into the Hadoop database and helped create the H-Model.
  • Worked on the Amazon Redshift database to pull data and load it into the Hive environment.
  • Worked in the Hive environment to create tables and made the necessary changes to load data into the H-Model.
  • Extensively used Informatica Power Center to extract and cleanse data from various sources and load it into the staging database.
  • Completed projects and development activities timely and accurately while following the System Development Life Cycle (SDLC).
  • Suggested changes and enhancements for ETL processes.
  • Worked with various transformations like Source Qualifier, Lookup, Stored Procedure, Sequence Generator, Router, Filter, Aggregator, Joiner, Expression, and Update Strategy.
  • Helped the team in analyzing the data to be able to identify the data quality issues.
  • Created Workflow, Worklets and Tasks to schedule the loads at required frequency using Workflow Manager.
  • Extensively worked with aggregate functions like Avg, Min, Max, First, Last, and Count in the Aggregator Transformation.
  • Extensively used SQL Override, Sorter, and Filter in the Source Qualifier Transformation.
  • Extensively used Normal Join, Full Outer Join, Detail Outer Join, and Master Outer Join in the Joiner Transformation.
  • Documented hours and development activities following SDLC and IS Change Management guidelines.
  • Scheduled Informatica workflows using the Power Center scheduler.
  • Used SQL tools like TOAD to run SQL queries and validate the data in warehouse and mart
  • Involved in debugging mappings, recovering sessions and developing error-handling methods
  • Resolved memory related issues like DTM buffer size, cache size to optimize session runs
  • Used Session Parameters to increase the efficiency of the sessions in the Workflow Manager
  • Wrote complex SQL scripts to avoid Informatica Look-ups to improve the performance as the volume of the data was heavy.
  • Tuned performance of Informatica session for large Data Files by increasing block size, data cache size, sequence buffer length and target based commit interval.
  • Developed complex reports that involved multiple data providers, master/detail charts, complex variables and calculation contexts.
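The lookup-avoidance and aggregation bullets above boil down to pushing the join and the aggregate into the source SQL instead of doing row-by-row lookups. A small SQLite sketch of that set-based pattern, with made-up table and column names:

```python
import sqlite3

# Illustrative schema: orders joined to customers in one SQL override,
# with an aggregate per customer, instead of a per-row Lookup.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE customers (cust_id INTEGER, name TEXT);
    CREATE TABLE orders (order_id INTEGER, cust_id INTEGER, amount REAL);
    INSERT INTO customers VALUES (1, 'Acme'), (2, 'Globex');
    INSERT INTO orders VALUES (10, 1, 100.0), (11, 1, 50.0), (12, 2, 75.0);
""")
# One set-based query replaces N lookup probes on heavy data volumes.
rows = con.execute("""
    SELECT c.name, COUNT(*) AS order_cnt, SUM(o.amount) AS total
    FROM orders o
    JOIN customers c ON c.cust_id = o.cust_id
    GROUP BY c.name
    ORDER BY c.name
""").fetchall()
```

In PowerCenter terms, this is the kind of SQL that would live in a Source Qualifier override, letting the database do the join and the Aggregator-style rollup in one pass.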

Environment: Oracle 11g, Amazon Redshift, Informatica Power Center 10.1/9.6.1, SQL Developer, SQL Workbench, Jupyter Notebook, PuTTY, and Hive.

Confidential, Jackson, MI

ETL Informatica Developer

Responsibilities:

  • Conducted JAD sessions with business users and SMEs for a better understanding of the reporting requirements.
  • Designed and developed end-to-end ETL processes from various source systems to the staging area, and from staging to data marts.
  • Developed high-level and detailed technical and functional documents, consisting of detailed design documentation and functional test specifications with use cases and unit test documents.
  • Analyzed source systems and worked with business analysts to identify, study, and understand requirements and translate them into ETL code.
  • Handled technical and functional calls across the teams.
  • Responsible for the Extraction, Transformation, and Loading (ETL) architecture and standards implementation.
  • Responsible for offshore code delivery and the review process.
  • Used Informatica to extract data from DB2, UDB, XML, Flat files and Excel files to load the data into the Warehouse.
  • Worked in all phases of Data Integration from heterogeneous sources, legacy systems to Target Database.
  • Worked on Informatica Power Center tool - Source Analyzer, Warehouse designer, Mapping and Mapplet Designer, Transformations, Informatica Repository Manager, Informatica Workflow Manager and Workflow Monitor.
  • Designed and developed complex Type 1 and Type 2 mappings for the stage to integration layer in the project.
  • Involved in Design Review, code review, test review, and gave valuable suggestions.
  • Worked with different Caches such as Index cache, Data cache, Lookup cache (Static, Dynamic and Persistence) and Join cache while developing the Mappings.
  • Created partitions for parallel processing of data and worked with DBAs to enhance data loads during production.
  • Performance-tuned Informatica sessions for large data files by increasing block size, data cache size, and target-based commit interval.
  • Worked on CDC (Change Data Capture) to implement SCD (Slowly Changing Dimensions).
  • Involved in writing a procedure to check the up-to-date statistics on tables.
  • Used the Informatica command task to transfer files to the bridge server for delivery to a third-party vendor.
  • Took part in migration of jobs from UIT to SIT and to UAT.
  • Created FTP scripts and Conversion scripts to convert data into flat files to be used for Informatica sessions.
  • Involved in Informatica Code Migration across various Environments.
  • Performed troubleshooting on the load failure cases, including database problems.
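The CDC-for-SCD bullet above reduces to comparing incoming rows against the current target image and bucketing them for downstream handling. A minimal Python sketch, with a hypothetical key and columns:

```python
def classify_changes(source_rows, target_rows, key="id"):
    """Change Data Capture sketch: compare source rows to the current
    target rows and bucket them as inserts, updates, or unchanged,
    so the SCD logic only touches rows that actually changed."""
    target_by_key = {row[key]: row for row in target_rows}
    inserts, updates, unchanged = [], [], []
    for row in source_rows:
        existing = target_by_key.get(row[key])
        if existing is None:
            inserts.append(row)       # new key: insert
        elif existing != row:
            updates.append(row)       # key exists, attributes differ
        else:
            unchanged.append(row)     # identical: skip
    return inserts, updates, unchanged

src = [{"id": 1, "city": "Jackson"}, {"id": 2, "city": "Detroit"}]
tgt = [{"id": 1, "city": "Lansing"}]
ins, upd, same = classify_changes(src, tgt)
```

Production CDC usually reads database logs or timestamps rather than full-row comparison, but the classification step looks the same either way.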

Environment: Informatica Power Center 9.1, Oracle 11g, MS SQL Server 2008, Teradata, HP Quality Center.

Confidential, Fort Wayne, IN

ETL Developer

Responsibilities:

  • Worked with Power Center tools like Designer, Workflow Manager, Workflow Monitor, and Repository Manager.
  • Worked on Designer tools like Source Analyzer, Warehouse Designer, Transformation Developer, and Mapping Designer.
  • Analyzed raw data from alliance partners and performed coding to generate reports according to the data model.
  • Responsible for determining bottlenecks with the Informatica server and fixing the issues by cleaning the Informatica repository logs on a timely basis.
  • Involved in data modeling and design of the data warehouse in star schema methodology, with conformed, granular dimensions and fact tables.
  • Developed mappings in Informatica to load data into marts and other layers such as persistent staging and the data warehouse.
  • Used the Informatica Debugger to troubleshoot data and error conditions.
  • Maintained documentation for all development work and error fixes performed.
  • Involved in deploying and redesigning of several small ETL processes for the existing research line of business.
  • Involved in developing reports using Business Objects.
  • Migrated ETL code across different ETL environments like Dev, QA, UAT, PROD.
  • Used tools like TOAD and SQL navigator to run the queries and validate the data.
  • Responsible for running UNIX scripts to start the Java web service that generates load confirmation emails to users.

Environment: Informatica Power Center 8.6/9.1, Oracle 9i/11g, PL/SQL, TOAD, UNIX, Erwin 4.2/7.x, Windows XP Professional, FTP, MS Excel, Teradata, MS Access, Maestro.

Confidential

SQL/ETL Developer

Responsibilities:

  • Performed extraction, transformation, and loading into the database using Informatica. Involved in logical and physical modeling of the drugs database.
  • Designed the ETL processes using Informatica to load data from Oracle, Flat Files, XML Files to target Oracle Data Warehouse database.
  • Based on the requirements created Functional design documents and Technical design specification documents for ETL.
  • Created tables, views, indexes, sequences and constraints.
  • Developed stored procedures, functions and database triggers using PL/SQL according to specific business logic.
  • Transferred data to the database using SQL*Loader.
  • Involved in testing of Stored Procedures and Functions. Designed and developed table structures, stored procedures, and functions to implement business rules.
  • Used legacy systems, Oracle, and SQL Server sources to extract the data and to load the data.
  • Involved in design and development of data validation, load process and error control routines.
  • Analyzed the database for performance issues and conducted detailed tuning activities for improvement.
  • Generated monthly and quarterly drugs inventory/purchase reports.
  • Coordinated database requirements with Oracle programmers and wrote reports for sales data.
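Load validation and error control routines like the ones described above typically split incoming rows into a clean set and a reject set with reason codes. A small illustrative Python sketch; the rules and field names (`drug_id`, `qty`) are made up for the example:

```python
def validate_rows(rows):
    """Error-control sketch: route rows that fail basic rules to a
    reject list tagged with a reason code, pass the rest through.
    Rules and field names are hypothetical."""
    clean, rejects = [], []
    for row in rows:
        if not row.get("drug_id"):
            rejects.append((row, "MISSING_KEY"))
        elif row.get("qty", 0) < 0:
            rejects.append((row, "NEGATIVE_QTY"))
        else:
            clean.append(row)
    return clean, rejects

clean, rejects = validate_rows([
    {"drug_id": "D1", "qty": 10},
    {"drug_id": None, "qty": 5},
    {"drug_id": "D2", "qty": -3},
])
```

In the actual project this kind of check would live in PL/SQL procedures or mapping logic, with rejects landing in an error table for reload after correction.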

Environment: Informatica Power Center 9.1, Oracle 8i, XML, SQL, PL/SQL, UNIX, Teradata.

Confidential

SQL/ETL Developer

Responsibilities:

  • Involved in analyzing the data models of legacy implementations, identifying the sources for various dimensions and facts for different data marts according to star schema design patterns.
  • Developed complex mappings using Source Qualifiers, Aggregators, connected and unconnected Lookups, Filters, and Update Strategy.
  • Extensively used various data cleansing and data conversion functions in various transformations.
  • Used the Debugger to validate transformations by creating breakpoints to analyze and monitor data flow.
  • Tuned Informatica session performance by increasing block size, data cache size, sequence buffer length, and target-based commit interval, and tuned mappings by dropping and recreating indexes.
  • Worked along with the QA Team and provided production support by monitoring the processes running daily.
  • Defined Target Load Order Plan for loading data into Target Tables.
  • Implemented Slowly Changing Dimensions methodology and developed mappings to keep track of historical data.
  • Wrote SQL overrides in Source Qualifiers and Lookups according to business requirements.
  • Involved in troubleshooting the loading failure cases, including database problems.
  • Responsible for Documentation of the Test cases, completed modules and the results of acceptance testing.
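A target load order plan, as in the bullet above, is at heart a dependency ordering of target tables so that foreign-key parents load before children. A minimal sketch using Python's standard topological sort; the table names and dependency map are hypothetical:

```python
from graphlib import TopologicalSorter

# Hypothetical star-schema dependencies: each table maps to the set of
# tables it depends on (its foreign-key parents), so dimensions must
# load before the fact table that references them.
deps = {
    "fact_sales": {"dim_customer", "dim_product"},
    "dim_customer": set(),
    "dim_product": set(),
}
load_order = list(TopologicalSorter(deps).static_order())
```

In PowerCenter the same ordering is declared in the mapping's Target Load Plan rather than computed, but the constraint being enforced is identical.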

Environment: Informatica Power Center 9.1, Oracle 9i, MS SQL Server 2005, UNIX, PL/SQL, UNIX shell scripting, SQL*PLUS, SQL, TOAD, Reports, MS Excel, MS Access, Flat Files, XML
