Sr ETL Informatica Consultant/Data Analyst Resume
San Diego, CA
SUMMARY
- Results-oriented Business Intelligence (BI) professional and motivated individual continuously seeking challenging roles, with over 8 years of experience in requirements gathering, data analysis, design, and development, and expertise in data warehousing solutions using Informatica Power Center/IDQ/IBM DataStage.
- Involved in building Enterprise Data Warehouses (EDW), Operational Data Store (ODS), Data Marts using Data modeling tool ERWIN and Dimensional modeling techniques (Kimball and Inmon), Star and Snowflake schema addressing Slowly Changing Dimensions (SCDs).
- Implemented best practices for data staging, data cleansing, and data transformation routines within the Informatica MDM solution.
- Worked with Informatica Client Tools - Designer, Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer, Mapping Designer, Workflow Manager and Workflow Monitor.
- Debugging, monitoring, and troubleshooting ETL interfaces and database jobs/queries.
- Strong knowledge of all phases of the Software Development Life Cycle (SDLC), with experience in the Educational, Financial, Manufacturing, Insurance, and Telecom domains.
- Developed complex Mappings using Informatica Power Center Transformations - Lookup, Filter, Expression, Router, Joiner, Update Strategy, Aggregator, Stored Procedure, Sorter, Sequence Generator and Slowly Changing Dimensions.
- Experience in developing complex Mapplets, Worklets, reusable Tasks, and reusable mappings.
- Worked on SQL, PL/SQL stored procedures, functions, packages, triggers, and cursors.
- Proficient in UNIX shell scripting and Perl scripting.
- Working experience in using Informatica Workflow Manager to create Sessions, Batches and schedule workflows and Worklets.
- Expert knowledge in troubleshooting and performance tuning at the source, mapping, target, and session levels.
- Experience in handling initial (i.e., history) and incremental loads into the target database using mapping variables.
- Knowledge of business intelligence reporting tools: Business Objects 6.5/6.0 and Cognos (ReportNet Report/Query Studio, Framework Manager).
- An excellent team member with an ability to perform individually with strong verbal, interpersonal and written communication skills.
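The initial/incremental load pattern above is commonly driven by a watermark persisted outside the mapping (mapping variables play the same role inside Informatica). A minimal shell sketch, assuming illustrative file paths and a hypothetical `src_orders` table:

```shell
#!/bin/sh
# Sketch: drive an initial-vs-incremental extract from a persisted watermark.
# The file paths, table, and column names are illustrative assumptions.
WM_FILE=/tmp/etl_watermark.txt
SQL_FILE=/tmp/extract.sql

if [ -f "$WM_FILE" ]; then
  # Incremental run: only rows changed since the last successful run.
  LAST_RUN=$(cat "$WM_FILE")
  FILTER="WHERE updated_at > TO_DATE('$LAST_RUN','YYYY-MM-DD HH24:MI:SS')"
else
  # Initial (history) run: no watermark yet, so extract everything.
  FILTER=""
fi

echo "SELECT * FROM src_orders $FILTER" > "$SQL_FILE"

# On success, advance the watermark for the next incremental run.
date '+%Y-%m-%d %H:%M:%S' > "$WM_FILE"
```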
TECHNICAL SKILLS
Data Warehousing: Informatica Power Center 9.6.1/9.5.1/9.1.0/8.6.1, Power Exchange, Informatica Data Quality (IDQ)/MDM 9.5/9.6, Hadoop 2.2, IBM WebSphere DataStage 9.1/8.5, Data Mining, Analytical and Transactional systems
Dimensional Data Modeling: Dimensional Data Modeling, Data Modeling, Star Schema Modeling, Snow-Flake Modeling, FACT and Dimensions Tables, Physical and Logical Data Modeling, Erwin 4.0/3.5, Oracle Designer
Business Intelligence: Business Objects 6.5, Cognos Series 7.0, SQL Server Reporting and Analysis, Web focus
Programming: Unix Shell Scripting, HiveQL, SQL, PL/SQL, ANSI SQL, Transact-SQL, SQL*Loader, Excel
Databases/Tools: Hive 1.2.1/2.1, HDFS, Oracle 10g/9i/8i, Netezza, XML, SQL Server, SAP
PROFESSIONAL EXPERIENCE
Confidential, San Diego, CA
Sr ETL Informatica Consultant/Data Analyst
Responsibilities:
- Worked with ETL Architect/ Data modelers to design new ETL for jobs with performance issues.
- Used SAP Exchange for Informatica to integrate and get real time source data using SAP ALE IDoc Reader Domain transformation.
- Worked with the Informatica Data Quality 9.6.1 (IDQ) toolkit: analysis, data cleansing, data matching, data conversion, exception handling, and the reporting and monitoring capabilities of IDQ.
- Worked on Informatica IDQ mappings to complete data profiling and matching/removing duplicate data.
- Dealt with general ledger/finance and product data and reports, fixing amounts so that business users see correct annual profits.
- Used Informatica SAP extractors for logistics and finance to be run 24/7 to get OLTP data.
- Worked on data cleansing and standardization using the cleanse functions in Informatica MDM.
- Used Hierarchies tool for configuring entity base objects, entity types, relationship base objects, relationship types, profiles, put and display packages and used the entity types as subject areas in IDD.
- Defined the Trust and Validation rules and set up the match/merge rule sets to produce the right master records.
- Configured match rule set property by enabling search by rules in MDM according to Business Rules.
- Involved in creating, monitoring, modifying, & communicating the project plan with other team members.
- Performed match/merge and ran match rules to check the effectiveness of MDM process on data.
- Analyzed the performance of individual transformations and optimized performance for both Informatica jobs and databases where applicable.
- Used JIRA Studio for bug tracking, issue tracking, and content management.
- Worked on JIRA Agile for Agile planning, management, and integration, as it gives each user the flexibility to plan sprints, changes, and releases.
- Worked on Confidential website customer and product data updates, which are populated in the cloud using AWS Redshift.
- Used the Skybot scheduling tool to run individual/member jobs or job suites as a batch; member jobs ensure no data dependencies between dimension and fact tables are missed.
Environment: Informatica Power Center/MDM 9.6.1, ABAP Extractor/SAP Exchange for Informatica, Informatica IDQ, XML, AWS Redshift, Wiki, JIRA Studio/Agile, Cognos Report Studio 10.2.2.0, Skybot, PL/SQL, SQL Server 2012, StarTeam, Microsoft SQL Server Management Studio 11.0, flat files.
Confidential, Brea, CA
Sr BI Informatica Developer/Analyst
Responsibilities:
- Involved in understanding business processes and coordinated with business analysts and users to get specific requirements to build Data Marts as per ETL standards.
- Worked on developing Functional design, Detailed level designs and Mapping documents as per the business requirements.
- Coordinated with onsite teams at different locations across the US and offshore to gather all the required code and data, and ensured every QA/Production release was done per the project plan.
- Performed impact analysis on the whole warehouse data model when major changes were implemented to improve the performance of fact/dimension jobs.
- Developed complex Mappings using Informatica Power Center Transformations - Lookup, Filter, Expression, Router, Joiner, Update Strategy, Aggregator, Stored Procedure, Sorter, Sequence Generator and Slowly Changing Dimensions.
- Created client-specific Linux Perl modules that stage/parse raw client data from a variety of formats and render documents into PostScript and/or PDF.
- Created stand-alone, browser-based, and command-line Perl utilities that run on Windows and Linux.
- Involved in writing NZSQL scripts to load fact and dimension tables in Netezza using Aginity Workbench.
- Used unix scripts to create parameter files and workflows in Informatica to load data from Audit layer to Datamarts.
- Integrated Hadoop into traditional ETL, accelerating the extraction, transformation, and loading of massive structured and unstructured data.
- Hands-on experience installing a multi-node Hadoop cluster (using the command-line interface) and maintaining it on AWS EC2, using both Windows and Unix as the local OS.
- Loaded unstructured data into Hadoop File System (HDFS).
- Hands-on experience writing HiveQL (Hive SQL) statements using commands such as CREATE, DROP, ALTER, DESCRIBE, TRUNCATE, and JOIN, along with aggregate grouping functions.
- Submitted HiveQL statements to be converted into Spark jobs and executed on Informatica Hadoop.
- Wrote UNIX shell/ksh scripts to search large flat files using commands like grep, and tar for image manipulation. Involved in shell scripting for required file manipulations and for sending email notifications once users received the required reports.
- Successfully set up the data and configured the components needed by Hierarchy Manager for the MDM Hub implementation, which included implementing hierarchies, relationship types, packages, and profiles using the Hierarchies tool in the Model workbench.
- Worked with Informatica team members to design, document, and configure the Informatica MDM Hub to support loading, cleansing, matching, merging, and publication of MDM data.
- Used Metadata manager for validating, promoting, importing and exporting repositories from development environment to testing environment.
- Involved with Data Steward Team for designing, documenting and configuring Informatica Data Director for supporting management of MDM data.
- Documented the issues and actions taken related to data cleansing using cleanse lists and predefined cleanse functions.
- Used ActiveBatch to schedule and monitor jobs.
- Responsible for unit, system, and integration testing; developed test scripts, test plans, and test data. Participated in UAT (User Acceptance Testing).
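A wrapper in the spirit of the parameter-file and workflow bullets above might look like the following sketch. The folder, workflow, and parameter names are hypothetical, and the `pmcmd` call (Informatica's workflow CLI) is left commented out because it requires a live Integration Service:

```shell
#!/bin/sh
# Sketch: build an Informatica parameter file, then start the workflow.
# All object names ($$LOAD_DATE, wf_load_datamart, etc.) are illustrative.
PARMFILE=/tmp/wf_load_datamart.parm
LOAD_DATE=$(date '+%Y-%m-%d')

cat > "$PARMFILE" <<EOF
[FOLDER_DM.WF:wf_load_datamart]
\$\$LOAD_DATE=$LOAD_DATE
\$\$SRC_SCHEMA=AUDIT
\$\$TGT_SCHEMA=DATAMART
EOF

# With a real Integration Service this would be kicked off roughly as:
# pmcmd startworkflow -sv INT_SVC -d DOMAIN -u "$PMUSER" -p "$PMPASS" \
#   -f FOLDER_DM -paramfile "$PARMFILE" -wait wf_load_datamart
```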
Confidential, Naperville, IL
Senior ETL Production Support Engineer
Responsibilities:
- Adopted existing UNIX directories and scripts and modified them for the required data processes, better performance, and enhancements.
- Developed and implemented the new Component trades flow, which includes loading the feed file in Perl/Unix shell, applying business rules through stored procedures, and generating ad hoc reports in Perl.
- Re-designed ETL jobs for performance issues as part of Production Support on-call rotation basis.
- Created fix workflows to load data after a failure in job due to data issue as part of Production Support.
- Developed custom item file jobs to provide special pricing on products to specific customers with bulk orders all year round.
- Identified and eliminated duplicates in datasets through the IDQ 8.6.1 Edit Distance and Mixed Field Matcher components, enabling a single view of customers and helping control mailing-list costs by preventing multiple pieces of mail.
- Designed and developed the DataStage server as well as parallel jobs for extracting, cleansing, transforming, integrating and loading data using DataStage designer.
- Created UNIX shell scripts to utilize the DataStage Parallel engine to handle large volumes of data.
- Used HP Quality Center for incident management to track ongoing major L1 and L2 tickets, and played a key role in closing many critical issues/bugs during the company's high-profit sales season.
- Used UC4 to schedule and monitor jobs.
- Used Quality Center for tracking and resolving defects.
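One common shape for the large-volume shell wrappers described above is pre-splitting a feed so the parallel engine can process chunks concurrently. A minimal sketch, with fabricated sample data and illustrative paths:

```shell
#!/bin/sh
# Sketch: pre-split a large feed file so parallel jobs can each load a chunk.
# The feed contents, paths, and chunk size below are fabricated examples.
FEED=/tmp/trades.dat
CHUNK_DIR=/tmp/trades_chunks
rm -f "$FEED"
rm -rf "$CHUNK_DIR"
mkdir -p "$CHUNK_DIR"

# Fabricate a 1,000-line sample feed.
i=1
while [ $i -le 1000 ]; do
  echo "TRADE|$i|USD" >> "$FEED"
  i=$((i + 1))
done

# 250 lines per chunk; each chunk becomes one parallel job's input file.
split -l 250 "$FEED" "$CHUNK_DIR/trades_part_"
```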
Environment: Informatica Power Center/IDQ 8.6.1, IBM InfoSphere DataStage 8.5, PL/SQL, Perl scripting, Oracle 10g/11g, SQL Server, Toad for Oracle 10.0, IT Remedy, flat files, BusinessObjects Enterprise 11.5, ParAccel, UC4 scheduler.
Confidential, Monroe, LA
ETL Informatica Developer/Business Analyst
Responsibilities:
- Worked on migration of data from Confidential system to Qwest server as a part of merging of the two companies.
- Involved in Requirement Analysis, ETL Design and Development for extracting from the source systems and loading it into the Warehouse and DataMarts.
- Analyzed, designed, developed, implemented and maintained moderate to complex initial and incremental load mappings to provide data for enterprise data warehouse.
- Wrote UNIX scripts to run and schedule workflows on Production server for the daily runs.
- Used transformations like Source Qualifier, Expression, Filter, Lookup transformations for transformation of the data and loading into the targets.
- Wrote complex SQL scripts to avoid Informatica joiners and lookups, improving performance where data volumes were heavy.
- Automated the Informatica process to update a control table in database when maps are run successfully.
- Developed PL/SQL procedures/packages to kick off the SQL*Loader control files/procedures that load the data into Oracle.
- Defined the file provisioning process (DW preprocessing steps); achieved 100% automation of file provisioning using UNIX, Informatica mappings, and Oracle utilities.
- Solely responsible for the daily loads, handling the reject data and re-loading the fixed records.
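The SQL*Loader step mentioned above revolves around a control file. A sketch of generating one from shell, with hypothetical table, column, and path names (the `sqlldr` invocation is commented out since it needs a live database):

```shell
#!/bin/sh
# Sketch: generate a SQL*Loader control file for a delimited flat file.
# Table, column, and path names are illustrative assumptions.
CTL=/tmp/load_customers.ctl

cat > "$CTL" <<'EOF'
LOAD DATA
INFILE '/data/in/customers.dat'
BADFILE '/data/bad/customers.bad'
APPEND
INTO TABLE stg_customers
FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
(customer_id, customer_name, state_cd, load_dt "SYSDATE")
EOF

# Against a real database this would be kicked off roughly as:
# sqlldr userid=$DB_USER/$DB_PASS control="$CTL" log=/data/log/customers.log
```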
Environment: Informatica Power Center 9.1/8.6.1/8.6.0, Teradata SQL Assistant 12.0, Oracle PL/SQL Developer 8.0, Oracle 10g, SQL*Plus, Unix shell scripts, Web Focus, Putty.
Confidential, Reston, VA
ETL Informatica Developer
Responsibilities:
- Gathered requirements, analyzed the specifications provided by clients, and updated Report Description and Software Requirements Specification documents.
- Worked on Informatica tool - Source Analyzer, Data warehousing designer, Mapping Designer & Mapplets, and Transformations.
- Developed mappings using reusable transformations.
- Created workflows with event wait task to make sure all prerequisites are met for each job in the flow.
- Modified UNIX scripts to monitor systems and automation of daily tasks and customer requests.
- Extensively used the Debugger to test data and perform unit testing.
- Loaded data from various sources (Flat files, Oracle, SQL Server, XML) using different Transformations like Source Qualifier, Joiner, Router, Sorter, Aggregator, Connected and Unconnected Lookup, Expression, Sequence Generator, Union and Update Strategy to load the data into the target.
- Used persistent/static/dynamic cache for better throughput of sessions containing Lookup, Joiner, and Aggregator and Rank transformations.
- Used Incremental Aggregation technique for better performance of aggregator transformation.
- Created mappings for populating data to dimensions and fact tables with huge volumes for history and daily loads separately.
- Provided Production Support for business users and documented problems and solutions for running the workflow.
- Moved the mappings and workflows to staging and then to the production environment, testing the process at every level.
- Used IBM ClearCase and ClearQuest for incident management to keep track of open and critical tickets.
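The event-wait prerequisite pattern above has a common shell-side equivalent: poll for an indicator (trigger) file before starting the dependent job. A minimal sketch with hypothetical file names and timeout:

```shell
#!/bin/sh
# Sketch: shell-side event wait. Poll for an upstream trigger file before
# starting the dependent load. File names and the timeout are illustrative.
TRIGGER=/tmp/feed_ready.trg
STATUS=/tmp/load_status.txt
TIMEOUT=10     # seconds to wait before giving up
ELAPSED=0
rm -f "$TRIGGER" "$STATUS"

# Simulate the upstream system dropping the trigger file after a delay.
( sleep 1; touch "$TRIGGER" ) &

while [ ! -f "$TRIGGER" ]; do
  if [ "$ELAPSED" -ge "$TIMEOUT" ]; then
    echo "prerequisite not met, aborting" >&2
    exit 1
  fi
  sleep 1
  ELAPSED=$((ELAPSED + 1))
done

echo "trigger found, starting load" > "$STATUS"
rm -f "$TRIGGER"   # consume the trigger so a stale file is never reused
```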
Environment: Informatica Power Center 8.6.1, Power Exchange 8.6.1, Oracle 11g/10g, IBM ClearCase/ClearQuest, Cognos 8, XML, PL/SQL, Toad, Unix Korn-shell scripts, Perl scripts, Erwin, Linux.
Confidential, Irving, TX
ETL Developer
Responsibilities:
- Participated in the initial Data Warehouse build phase, which included logical and physical modeling.
- Handled various types of sources like flat files, Oracle, SQL Server.
- Extracted SAP HR data into Informatica using SAP PowerConnect and the Application Source Qualifier to bring data from the SAP system into the Source Analyzer.
- Worked extensively with complex mappings using transformations like update strategy, expression, aggregator, stored procedure, filter, lookup.
- Created reusable transformations and mapplets in the designer using transformation developer and mapplet designer tools.
- Used the Workflow Monitor to monitor tasks and workflows, and to monitor performance using collected statistics.
- Worked on writing and tuning SQL and PL/SQL Procedures.
- Worked on UNIX shell Scripts.
- Appropriate tuning was performed to improve performance and provide maximum efficiency for ETL Jobs and Oracle database level scripts.
Environment: Informatica Power Center 8.6.1/8.5/7.1, Power Connect 8.6.1, Oracle 11g/10g, SAP, MS SQL Server 2005, XML, PL/SQL, Toad 8.1, Unix shell scripts, MS Visio 2007.