Sr. Informatica Developer | Data Analyst | IDQ | Hadoop Resume

Los Angeles, CA

SUMMARY:

  • Senior ETL professional with 8+ years of experience across Informatica Power Center, Informatica Data Quality, SQL, data warehousing, databases, data modeling, data analytics, and big data.
  • Skilled in designing and implementing Data Mart applications, mainly transformation processes, using the ETL tools Informatica Power Center and IDQ; also skilled in Master Data Management (MDM).
  • Very good experience with Enterprise Data Warehouses (EDW), Operational Data Stores (ODS), Data Marts, and dimensional modeling techniques using Star and Snowflake schemas.
  • Strong experience in all phases of the Software Development Life Cycle (SDLC); proficient in analyzing business processes/requirements and translating them into technical requirements.
  • Skilled in building complex enterprise data integrations using PowerCenter Advanced Edition 9.x.
  • Expertise in various Informatica Tools - Source Analyzer, Target Designer, Mapping Designer, Mapplet Designer, Transformation Developer, Workflow Manager and Workflow Monitor.
  • Skilled in designing and developing Informatica mappings with varied transformation logic using Filter, Expression, Rank, Router, Aggregator, Joiner, Lookup, Update Strategy, Stored Procedure, etc.
  • Strong Experience in integration of various data sources like Teradata, Oracle, SQL Server, and Flat Files into the data warehouse.
  • Very good experience working with relational databases and database objects such as stored procedures, functions, packages, triggers, and cursors in SQL and PL/SQL.
  • Skilled in querying databases by developing and running SQL and PL/SQL queries in tools such as SQL Developer, TOAD, SQL Assistant, and SQL Query Analyzer.
  • Very good experience with testing activities such as Unit Testing, Integration/System Testing, and User Acceptance Testing (UAT), ensuring code quality.
  • Proficient as a Data Analyst in gathering data from different sources, data profiling, data definition, and loading data into the business warehouse.
  • Strong experience in data analysis, data migration, data cleansing, transformation, integration, data import, and data export using ETL tools such as Informatica Power Center.
  • Excellent experience in data mining, querying and mining large datasets to discover transaction patterns and examine financial data.
  • Experienced as a Data Analyst performing complex data profiling, data definition, data mining, and data validation and analysis.
  • Experience in UNIX shell scripting, automation of ETL processes using Autosys.
  • Strong knowledge of Hadoop platforms and other distributed data processing platforms.
  • Strong knowledge of Hadoop architecture and components such as HDFS, MapReduce, HBase, Hive, Pig, and Sqoop.
  • Very Good Experience in UNIX Shell Scripting & Python.
  • Experienced in Agile software development methodology, working with team members in sprints, standup meetings.
  • Well-developed interpersonal and communication skills.
  • Highly self-motivated and able to set effective priorities to achieve immediate & long-term goals and meet project & operational deadlines.

TECHNICAL SKILLS:

ETL Tools: Informatica Power Center 9.6/9.5/9.1/8.6 (Designer, Workflow Manager, Workflow Monitor, Repository Manager)

DWH: Oracle Data Warehouse, SAP HANA DW, Teradata DWH

Test Management Tools: HP ALM/Quality Center, Jira, Rally

Scheduling Tools: Autosys, Tivoli, UC4

Data Quality: Informatica Data Quality (IDQ), DVO (Data Validation Option)

Operating Systems: Windows 7/10, Windows Vista/XP/2000, UNIX/Linux, Mac.

Databases: Oracle 8i/9i/10g/11g, Teradata V2R12/V2R6, Microsoft SQL Server 2008/2012/2014

Development tools: Toad, Teradata SQL Assistant, SQL Developer

Big Data: Hadoop, HDFS, Hive, Pig, Sqoop

Programming/Languages: SQL, PL/SQL, UNIX shell scripting

SDLC Methodologies: Agile/Scrum, Waterfall, Iterative

Microsoft Suite: Microsoft Office (Word, Excel, PowerPoint, Outlook, OneDrive)

PROFESSIONAL EXPERIENCE:

Confidential, Los Angeles, CA

Sr. Informatica Developer | Data Analyst | IDQ | Hadoop

Responsibilities:

  • Analyzed the technical requirements and data flows and prepared technical strategy documents.
  • Worked with heterogeneous sources including relational sources and flat files.
  • Installed required command line utilities for UNIX on the server where UC4 is running and wrote wrapper scripts to call the scripts from UC4.
  • Built Physical Data Objects and developed various mappings and mapplets/rules using Informatica Data Quality to profile, validate, and cleanse the data per requirements.
  • Designed and developed Technical and Business Data Quality rules in IDQ and created the Score Card to present it to the Business users for a trending analysis (Informatica Analyst).
  • Defined Business rules in Informatica Data Quality (IDQ) to evaluate data quality by creating cleanse processes to monitor compliance with standards and assist in resolving data quality issues.
  • Worked on UNIX scripts to manipulate files and data in UNIX and Hadoop.
  • Worked on HDFS, copying files from the local file system into HDFS (see the file-handling sketch after this list).
  • Imported data from relational databases into HDFS and Hive using Sqoop.
  • Used Sqoop to extract data from Teradata into the Hadoop cluster and built Sqoop jobs with incremental loads to populate Hive tables (see the Sqoop sketch after this list).
  • Validated the Informatica ETL mappings to load the data from HDFS to Teradata data warehouse.
  • Extracted data from Teradata using Informatica Power Center ETL and DTS packages into target databases, including SQL Server, and used the data for reporting purposes.
  • Worked on Tidal to run Informatica and Hadoop jobs in parallel.
  • Developed data pipelines using Sqoop, Pig and Hive to ingest customer member data, clinical, biometrics, lab and claims data into HDFS to perform data analytics.
  • Extensively worked on 837 HIPAA files for loading claims/encounter data into the enterprise data warehouse.
  • Processed claims through EDI 837 files into the FACETS system and worked on scenarios covering the complete claims lifecycle.
  • Responsible for ensuring members were loaded correctly into CSP Facets through 834 files.
  • Used Informatica B2B Data Transformation Studio to parse the HIPAA files using the parsers and HIPAA libraries.
  • Created ETL mappings and mapplets to extract data from ERP systems such as SAP and load it into the EDW, and developed transformation logic using Informatica.
  • Developed Mappings, Sessions and Workflows to extract, validate, and transform data as per the business rules.
  • Developed complex mappings in Informatica to load the data from source files using different transformations like Source Qualifier, Lookup (connected and unconnected), Expression, Aggregator, Update Strategy, Joiner, Filter, Stored Procedure, Normalizer, and Router, as well as Mapplets.
  • Extensively used Informatica Functions LTRIM, RTRIM, DECODE, ISNULL, IIF, INSTR and date functions in Transformations.
  • Wrote several UNIX Shell Scripts for cleanup, logging, file manipulation and transferring the files.
  • Created PL/SQL functions and called them using Informatica Stored Procedure transformations.
  • Created simple, complex, and materialized views to retrieve data from one or more tables.
  • Created indexes (B-tree, bitmap) to improve query performance.
  • Built stored procedures and used the SQL*Loader utility for fast bulk loading of data into Oracle tables (see the SQL*Loader sketch after this list).
  • Used PL/SQL collections to improve stored procedure performance.
  • Wrote Python modules to connect to Oracle and perform CRUD operations.
  • Developed ETL processes used to integrate data between entities using Python and Oracle APIs.
  • Involved in review of the mappings and enhancements for better optimization of the Informatica mappings, sessions and workflows.
  • Worked on performance tuning of programs, ETL procedures and processes.
  • Coordinated with the QA team in various testing phases and worked on the defects reported by the testing team.
  • Created ETL test data for all ETL mapping rules and tested the functionality of the Informatica Mappings.
  • Executed the Test Cases with Pass, Fail, and Blocked status.
  • Prepared test data using MS Excel sheet, created data driven test for testing the application with positive and negative inputs.
  • Coordinated with the BI team and provided technical documentation.
  • Interacted closely with team members and business analysts and worked in an Agile environment.
  • Worked with big data teams to move ETL tasks to Hadoop.
  • Project and Team management, work allocation and status reporting.
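
The Sqoop sketch referenced in the list above: a minimal example of the kind of incremental import used to pull new Teradata rows into a Hive staging table. The connection string, table, watermark column, and job name are illustrative, not taken from the project.

    # Saved Sqoop job: append-only incremental pull from Teradata into Hive.
    # Host, schema, and the CLAIM_ID watermark column are assumptions.
    sqoop job --create claims_incr_load -- import \
      --connect jdbc:teradata://edw-host/DATABASE=claims_db \
      --driver com.teradata.jdbc.TeraDriver \
      --username "$TD_USER" --password-file /user/etl/.tdpass \
      --table CLAIM_TXN \
      --hive-import --hive-table stage.claim_txn \
      --incremental append --check-column CLAIM_ID --last-value 0 \
      -m 4

    # Each scheduled run then picks up only rows newer than the stored watermark:
    sqoop job --exec claims_incr_load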
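
The file-handling sketch referenced in the list above: a hedged example of the kind of UNIX script used to copy local extract files into HDFS with logging and archiving; every path is a placeholder.

    # Move the day's extracts into HDFS and archive the originals, logging each step.
    SRC_DIR=/etl/extracts/incoming
    LOG=/etl/logs/hdfs_put_$(date +%Y%m%d).log
    for f in "$SRC_DIR"/*.dat; do
        [ -e "$f" ] || continue                  # nothing to process
        if hdfs dfs -put -f "$f" /data/stage/ >> "$LOG" 2>&1; then
            gzip "$f" && mv "$f.gz" /etl/extracts/archive/
            echo "$(date '+%F %T') loaded $f" >> "$LOG"
        else
            echo "$(date '+%F %T') FAILED  $f" >> "$LOG"
        fi
    done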
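
The SQL*Loader sketch referenced in the list above, assuming a pipe-delimited extract and a staging table; file, table, and column names are placeholders.

    # Generate a control file, then run a direct-path bulk load into Oracle.
    cat > stg_member.ctl <<'EOF'
    LOAD DATA
    INFILE 'member_extract.dat'
    APPEND INTO TABLE stg_member
    FIELDS TERMINATED BY '|' TRAILING NULLCOLS
    (member_id, first_name, last_name, dob DATE 'YYYY-MM-DD')
    EOF
    sqlldr userid="$ORA_USER/$ORA_PASS@ORCL" control=stg_member.ctl \
           log=stg_member.log bad=stg_member.bad direct=true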

Environment: Informatica Power Center 9.6.1, Informatica Data Quality (IDQ) 9.6.1, SQL, PL/SQL, Oracle 11g, Teradata, DB2, Toad, Teradata SQL Assistant, SQL Loader, SAP HANA, HIPAA, SAP BW, SAP Connectors, Autosys, Facets, Flat files, UC4 Job Scheduler, Python, Apache Hadoop, HDFS, Hive, Sqoop, Spark, Tidal, XML Files, Agile/Scrum, Excel, UNIX.

Confidential, New York

Informatica Developer | IDQ Developer | Data Analyst

Responsibilities:

  • Responsible for coordinating with the Business Analysts and users to understand business and functional requirements and implement the same into an ETL design document
  • Worked with the functional and technical teams like SAP HR and gathered the business requirements for extracting data.
  • Prepared migration document to move the mappings from development to testing and then to production repositories.
  • Developed stored procedures and used them in Stored Procedure transformations for data processing, and used data migration tools.
  • Extensively used PVCS to check in move forms, parameter files, and database scripts required for migration.
  • Developed various Informatica Mappings & Mapplets to load data using different transformations like Source Qualifier, Joiner, Router, Sorter, Aggregator, Connected and Unconnected Lookup, Update Strategy, Expression, Stored Procedure, Sequence Generator to load the data into the target tables.
  • Used various transformations, including connected and unconnected Lookups, with SAP data integrators as the SAP source system to maintain data consistency in Informatica Power Center.
  • Designed complex mapping logic to implement SCD Type 1 and Type 2 dimensions and worked on critical dimensional modeling, which helps structure and organize data uniformly with constraints placed within the structure.
  • Extensively worked on Informatica IDE/IDQ.
  • Designed reference data and data quality rules using IDQ and was involved in cleansing data in the Informatica Data Quality 9.1 environment.
  • Developed UNIX Shell scripts to automate repetitive database processes.
  • Developed transformation logic to cleanse the source data of inconsistencies during the source to stage loading.
  • Developed ETL jobs that consumed data from SAP FICO systems and loaded it into the SAP data warehouse.
  • Used the Debugger to test the mappings and fix bugs.
  • Involved in performance tuning at the source, target, mapping, session, and system levels.
  • Used Informatica Workflow Manager to schedule workflows, created sequential and parallel loads, used Workflow Monitor to monitor the load process, and troubleshot session failures using log files.
  • Created tasks such as Session, Event Wait, Event Raise, Command, Decision, and Control using Workflow Manager and monitored the jobs there.
  • Enhanced existing data sources to meet the customer's requirements.
  • Involved in performing Integration testing, System testing, Functionality testing, UAT and Regression testing.
  • Wrote SQL queries and Scripts to validate source data versus data warehouse data including identification of duplicate records.
  • Implemented Teradata FastLoad, MultiLoad, and BTEQ scripts for DML and DDL (see the BTEQ sketch after this list).
  • Wrote complex queries in Teradata SQL Assistant to check the data in Source and Target.
  • Analyzed the stored procedures in the database, understood how the code was implemented, and validated the data against the functional requirements.
  • Developed functional and integration test cases, reviewed test cases written by peers, and had my own test cases peer-reviewed.
  • Created and executed data-driven test scripts in QTP.
  • Tested several complex ETL mappings and reusable transformations for daily data loads.
  • Performed performance tuning in Informatica mappings by identifying bottlenecks and implementing effective transformation logic.
  • Worked closely with business users and technical teams in a customer-facing role to bring requirements together into a cohesive design.
  • Worked with the SAP production team and provided 24x7 production support and maintenance for all the applications in the ETL process.
  • Provided data to the reporting team for their daily, weekly and monthly reports.
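
The BTEQ sketch referenced in the list above: an illustrative post-load check that flags duplicate records in the target, in the spirit of the source-versus-warehouse validation queries described; the logon string and table names are assumptions.

    # Fail the job step (exit 8) if the duplicate check errors out.
    bteq <<EOF
    .LOGON tdprod/etl_user,$TD_PASS;
    -- any natural key appearing more than once in the warehouse is a defect
    SELECT claim_id, COUNT(*) AS dup_cnt
    FROM   edw.claim_fact
    GROUP  BY claim_id
    HAVING COUNT(*) > 1;
    .IF ERRORCODE <> 0 THEN .QUIT 8;
    .LOGOFF;
    .QUIT 0;
    EOF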

Environment: Informatica Power Center 9.5/8.6, Informatica Data Quality 9.1, Workflow Manager, Workflow Monitor, DVO, Oracle 11g, IBM Cognos, PL/SQL, Teradata, SQL Developer, SQL Server, SSMS, Teradata SQL Assistant, Excel, Control-M, UNIX, Quick Test Professional, SAP HR, SAP FICO, Agile, Scrum.

Confidential, Louisville, KY

Informatica Developer

Responsibilities:

  • Designed the physical model and ETLs to source data from current systems.
  • Built the necessary staging tables and work tables in the Oracle development environment.
  • Designed a flexible ETL process that runs as an incremental load or a full load with minimal changes (see the parameter-file sketch after this list).
  • Developed Mappings, Sessions and Workflows to extract, validate, and transform data as per the business rules.
  • Developed complex mappings in Informatica to load the data from source files using different transformations like Source Qualifier, look up (connected and unconnected), Expression, Aggregate, Update Strategy, Joiner, Filter, Stored Procedures, Router and Mapplets.
  • Created High level/detailed level design documents and involved in creating ETL functional and technical specification.
  • Worked on Teradata Parallel Transporter, using the Load operator to load staging tables and the Update operator to load EDW tables.
  • Involved in review of the mappings and enhancements for better optimization of the Informatica mappings, sessions and workflows.
  • Extensively worked in performance tuning of programs, ETL procedures and processes.
  • Worked on data cleansing and standardization using the cleanse functions in Informatica.
  • Involved in implementing the Land Process of loading the customer/product Data Set into Informatica from various source systems.
  • Coordinated with the QA team in various testing phases and fixed the defects reported by the testing team.
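
The parameter-file sketch referenced in the list above, showing how one workflow can run as either a full or an incremental load; the paths, folder, workflow, and parameter names are illustrative.

    # Choose the extraction window, then hand it to the workflow through a
    # parameter file; pass FULL as the first argument to force a full load.
    LOAD_TYPE=${1:-INCR}
    if [ "$LOAD_TYPE" = "FULL" ]; then
        FROM_DT='1900-01-01'
    else
        FROM_DT=$(cat /etl/ctl/last_run_date.txt)    # watermark from prior run
    fi
    {
      echo "[EDW_FOLDER.WF:wf_stage_load]"
      echo "\$\$EXTRACT_FROM_DT=$FROM_DT"
    } > /etl/ctl/wf_stage_load.param
    date '+%Y-%m-%d' > /etl/ctl/last_run_date.txt    # advance the watermark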

Environment: Informatica Power Center 8.6, Oracle 11g, DB2, SQL Server 2008, Teradata V2R6, Tivoli, HP Quality Center 10, Clear Quest, SQL Developer, SQL Assistant, SSMS, Toad, UNIX, Agile, Scrum.

Confidential, Chicago, IL

Informatica Developer

Responsibilities:

  • Actively involved in understanding business requirements, analysis and design of the ETL process.
  • Used Informatica Power Center to extract, transform and load data from various source systems to staging and target Data warehouse.
  • Used Power Center Workflow Manager for session management, database connection management and scheduling of jobs to be done in batch process.
  • Tuned the performance of mappings by following Informatica best practices and applied several methods to get best performance by decreasing the run time of workflow.
  • Worked with the transformations like Lookup, Joiner, Filter, Aggregator, Source Qualifier, Expression, Update Strategy and Sequence Generator Transformations.
  • Used Type 1 and Type 2 SCD mappings to update Slowly Changing Dimension tables.
  • Modified several of the existing mappings based on the user requirements and maintained existing mappings, sessions and workflows.
  • Worked on different tasks in workflows such as Session, Event Raise, Event Wait, Decision, and Email.
  • Extensively worked on Mapping Variables, Mapping Parameters and Session Parameters.
  • Developed PL/SQL procedures, functions, and triggers for processing business logic in the database. Wrote optimized SQL queries for data retrieval from the source database.
  • Worked with pmcmd to interact with the Informatica server from command mode and execute shell scripts (see the pmcmd sketch after this list).
  • Monitored the session logs to check the progress of the data load.
  • Involved in different Team review meetings.
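
The pmcmd sketch referenced in the list above: a hedged example of launching a workflow from a shell script and reacting to its exit code; the service, domain, folder, and workflow names are placeholders.

    # Start the workflow and block until it completes (-wait).
    pmcmd startworkflow -sv INT_SVC -d DOM_DEV -u "$PM_USER" -p "$PM_PASS" \
          -f SALES_DM -wait wf_load_sales
    rc=$?
    if [ $rc -ne 0 ]; then
        echo "wf_load_sales failed with exit code $rc" >&2
        exit $rc
    fi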

Environment: Informatica Power Center 8.1, Oracle 10g, TOAD, UNIX

Confidential, Chicago, IL

Informatica Developer

Responsibilities:

  • Involved in requirements gathering, functional/technical specification, Designing and development of end-to-end ETL process for Sales Data Warehouse.
  • Studied the existing OLTP system(s) and created facts, dimensions, and a star schema representation for the data mart.
  • Used Informatica Power Center for extraction, transformation, and loading (ETL) of data from heterogeneous source systems.
  • Imported Source/Target Tables from the respective databases and created reusable transformations (Joiner, Routers, Lookups, Rank, Filter, Expression, and Aggregator) in a Mapplet and created new mappings using Designer module of Informatica.
  • Modeled the data warehouse data marts in a star-join schema.
  • Created stored procedures for data transformation purposes (see the stored procedure sketch after this list).
  • Created Tasks, Workflows, Sessions to move the data at specific intervals on demand using Workflow Manager and Workflow Monitor.
  • Extensively worked on the performance tuning of the Mappings as well as the sessions.
  • Involved in writing UNIX shell scripts for the Informatica ETL tool to run the sessions.
  • Worked with database connections, SQL joins, cardinalities, loops, aliases, views, aggregate conditions, parsing of objects and hierarchies.
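
The stored procedure sketch referenced in the list above: an illustrative transformation helper of the kind called from Informatica's Stored Procedure transformation, created here from a shell session; the procedure, schema, and connection names are assumptions.

    sqlplus -s "$ORA_USER/$ORA_PASS@SALESDW" <<'EOF'
    CREATE OR REPLACE PROCEDURE clean_name (p_in IN VARCHAR2, p_out OUT VARCHAR2)
    AS
    BEGIN
      -- trim stray whitespace and case-normalize names during the load
      p_out := INITCAP(TRIM(p_in));
    END;
    /
    EOF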

Environment: Informatica 8.1, Oracle 8i/9i, TOAD, UNIX
