Sr. ETL Developer/Analyst Resume
Auburn Hills, MI
PROFESSIONAL SUMMARY:
- 6+ years of IT experience in analysis, design, development and testing in client/server environments, with a focus on data warehousing applications using ETL/OLAP tools such as Informatica and Business Objects against Oracle and SQL Server databases.
- Experience in gathering business requirements, defining and capturing metadata for business rules, system analysis, design, development, testing, production support, and the strategic decision making and problem solving associated with ETL processes, working closely with the BI team.
- Implemented Data Warehousing, Data Marts, Dimensional Modeling, Star/Snowflake Schema Modeling, Entity-Relationship Modeling, OLAP and OLTP concepts.
- Implemented Change Data Capture/Incremental Loading and Slowly Changing Dimensions (a minimal SQL sketch of the incremental-load pattern follows this summary).
- Extensive experience in extraction, transformation and loading of data directly from heterogeneous source systems such as flat files (fixed width and delimited), COBOL files, VSAM, IBM DB2, Excel, Oracle, MS SQL Server and Teradata.
- Familiarity with PL/SQL Functions, Cursors, Packages, Views, Indexes and extensive experience in writing SQL queries and Stored Procedures.
- Experience working closely with project management to ensure timely delivery of solutions, quickly learning business concepts and relating them to specific project needs.
- Created Design and Mapping Documents after interacting with business analysts to clarify requirements.
- Experience in Performance Tuning. Identified and fixed bottlenecks and tuned the complex Informatica mappings for better Performance.
- Experience in writing complex sub-queries, PL/SQL programs (functions, procedures, packages), stored procedures, and shell scripts to run pre-session and post-session commands.
- Expertise in using GUI tools like Erwin, Toad, SQL Developer, SQL Navigator and SQL*Loader.
- Proficiency in developing standards and reusable logic using the Informatica ETL tool.
- Strong knowledge of applying transformations such as Aggregator, Joiner, Connected and Unconnected Lookup, SQL transformation, Unconnected Stored Procedure, Normalizer and XML Parser in developing code to integrate and archive data as required by the organization.
- Created shortcuts for source and target instances and mappings in specified folders.
- Adapted performance tuning techniques to optimize ETL processes.
- Experience in validating ETL code built with tools like Informatica PowerCenter.
- Experience in UNIX shell scripting to locate files on the server, capture record counts, and generate parameter files and email notifications.
- Collaborated with admin team to migrate code to different environments.
- Involved in production support, resolving the production job failures, interacting with the operations support group for resuming the failed jobs.
- Familiarity with V-model and Agile environments.
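A minimal sketch of the incremental-load pattern noted above, assuming a hypothetical ORDERS source, an ORDERS_DIM target and an ETL_LOAD_CONTROL watermark table; in the actual projects this logic was implemented through Informatica mappings and session properties rather than hand-run SQL.

```sql
-- Hypothetical incremental load: pick up only rows changed since the last
-- successful run, upsert them, then advance the watermark.
MERGE INTO orders_dim tgt
USING (
    SELECT o.order_id, o.customer_id, o.order_amt, o.last_updt_ts
    FROM   orders o
    WHERE  o.last_updt_ts > (SELECT last_extract_ts
                             FROM   etl_load_control
                             WHERE  table_name = 'ORDERS')
) src
ON (tgt.order_id = src.order_id)
WHEN MATCHED THEN UPDATE SET
    tgt.customer_id  = src.customer_id,
    tgt.order_amt    = src.order_amt,
    tgt.last_updt_ts = src.last_updt_ts
WHEN NOT MATCHED THEN INSERT
    (order_id, customer_id, order_amt, last_updt_ts)
    VALUES (src.order_id, src.customer_id, src.order_amt, src.last_updt_ts);

-- Move the watermark forward only after the load succeeds.
UPDATE etl_load_control
SET    last_extract_ts = SYSTIMESTAMP
WHERE  table_name = 'ORDERS';
```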
TECHNICAL SKILLS:
ETL Tools: Informatica PowerCenter 9.5/9.1/8.6/7.x, Salesforce, Informatica Cloud, Informatica PowerExchange 5.1/4.7/1.7, PowerAnalyzer 3.5, Informatica PowerConnect and Metadata Manager.
Databases: Oracle 10g/9i/8i/8.0/7.x, Teradata 13, DB2 UDB 8.1, MS SQL Server 2008/2005, Netezza 4.0 and Sybase ASE 12.5.3/15.
Operating Systems: UNIX (Sun-Solaris, HP-UX), Windows NT/XP/Vista, MSDOS
Programming: SQL, SQL*Plus, PL/SQL, UNIX Shell Scripting
Reporting Tools: Business Objects XI R2/6.5/5.0/5.1, Cognos Impromptu 7.0/6.0/5.0, Informatica Analytics Delivery Platform, MicroStrategy.
Modeling Tools: Erwin 4.1 and MS Visio
Other Tools: SQL Navigator, Rapid SQL for DB2, Quest Toad for Oracle, SQL Developer 1.5.1, Autosys, Telnet, MS SharePoint, Mercury Quality Center, Tivoli Job Scheduling Console, JIRA
PROFESSIONAL EXPERIENCE:
Confidential, Auburn Hills, MI
Sr. ETL Developer/Analyst
Responsibilities:
- Worked with the business team and architects to translate business requirements into technical designs.
- Worked on Power Center client tools like Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer and Transformations Developer.
- Involved in performance tuning of mappings and sessions.
- Participated in weekly end-user meetings to discuss data quality, performance issues and ways to improve data accuracy and new requirements, etc.
- Participated in test case walkthroughs and inspection meetings, and worked with the development and data warehouse teams to solve problems encountered in test execution.
- Participated in review meetings with the development team to meet testing targets, and in daily status meetings to report bugs, issues and risks.
- Involved in unit, system integration and user acceptance testing of mappings. Prepared complex SQL override scripts at the Source Qualifier level to avoid Informatica Joiners and Lookups and improve performance, as the data volume was heavy (see the sketch after this list).
- Responsible for developing ETL processes/programs with Informatica PowerCenter to extract data from the client's operational databases, transform and filter the data, and load it into the target database.
- Responsible for creating workflows and worklets. Created Session, Event, Command, Control, Decision and Email tasks in Workflow Manager.
- Provided application support for the data warehouse, especially the portion I was responsible for.
- Created Perl scripts to generate email notifications, to load data from different source systems such as flat files and Oracle databases (ETL), and to schedule workflows.
- Created high-level Technical Design Document and Unit Test Plans.
- Documented Informatica mappings, design and validation rules.
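An illustrative Source Qualifier override of the kind referenced above; the table names, columns and the $$BATCH_ID mapping parameter are assumptions used to show how join and lookup work was pushed down to the database:

```sql
-- Join and "lookup" done in the Oracle source so the mapping reads one
-- pre-joined result set instead of using Joiner/Lookup transformations
-- on a heavy data volume.
SELECT  ord.order_id,
        ord.order_dt,
        ord.order_amt,
        cust.customer_nbr,
        cust.customer_name,
        prod.product_code              -- would otherwise be an unconnected lookup
FROM    stg_orders     ord
JOIN    stg_customers  cust ON cust.customer_id = ord.customer_id
LEFT JOIN ref_products prod ON prod.product_id  = ord.product_id
WHERE   ord.load_batch_id = $$BATCH_ID -- Informatica mapping parameter
```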
Environment: Informatica PowerCenter 9.5.1, Oracle 11g/10g, TOAD, Business Objects XI 3.5, Solaris 11/10, Teradata, ClearCase, Tivoli Job Scheduler
Confidential, Omaha, NE
Sr. Data Warehouse Developer
Responsibilities:
- Involved in Analysis, Requirement Gathering with business users and Design process.
- Extracted data from flat files, VSAM, XML and RDBMS systems such as Oracle, Netezza, SQL Server and DB2.
- Developed a number of complex Informatica mappings and mapplets to implement business logic.
- Developed Informatica mappings using Aggregators, SQL overrides in Lookups, filters in Source Qualifiers, and Router transformations to manage data flow into multiple targets.
- Provided production support by troubleshooting session logs, bad files, tables and mappings.
- Migrated data from Oracle to Teradata using Informatica and Teradata utilities such as BTEQ (a rough BTEQ sketch follows this list).
- Developed Informatica scripts to populate the global repository from the local repository.
- Worked with the scheduler to run Informatica sessions on a daily basis and to send an email after loading completed.
- Conducted unit testing and integration testing.
- Analyzed the reliability, assumptions and restrictions documented in the Functional, Technical and Architecture documents, which are confidential.
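A rough sketch of the kind of BTEQ script used in the Oracle-to-Teradata migration above; the logon, file layout, staging table and target table are all hypothetical:

```sql
.LOGON tdpid/etl_user,etl_password;

/* Load the staging table from a delimited file extracted out of Oracle */
.IMPORT VARTEXT '|' FILE = /data/extract/cust_dim.dat
.REPEAT *
USING (cust_id   VARCHAR(18),
       cust_name VARCHAR(60),
       updt_dt   VARCHAR(19))
INSERT INTO stg_db.cust_dim_stg (cust_id, cust_name, updt_dt)
VALUES (:cust_id, :cust_name, :updt_dt);

/* Move typed, clean rows from staging into the target table */
INSERT INTO edw_db.cust_dim (cust_id, cust_name, updt_ts)
SELECT CAST(cust_id AS INTEGER),
       cust_name,
       CAST(updt_dt AS TIMESTAMP(0) FORMAT 'YYYY-MM-DDBHH:MI:SS')
FROM   stg_db.cust_dim_stg;

.LOGOFF;
.QUIT;
```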
Environment: Informatica Power Center 9.1.0/8.6.1, Oracle 10g, Teradata, SQL Server 2008, Toad, SQL Plus, SQL Query Analyzer, SQL Developer, MS Access, Windows NT, Shell Scripting, ClearQuest, Tivoli Job Scheduler
Confidential
ETL Developer (BI Project)
Responsibilities:
- Participated in all phases including requirement gathering, analysis, design and development.
- Interacted with IT management, SMEs, Architects, Project Manager and Customers to understand the business requirements and to translate them into corresponding technical documents.
- Managed data warehouse enhancements and related projects, defined requirements and ensured that work was completed accurately and on time.
- Prepared the Technical Design Document for the requirements, including key project assumptions, technical flow, process flow, ETL logical design, logic and error handling.
- Worked extensively on Mappings, Mapplets, Sessions and Workflows.
- As part of the ETL team, worked extensively with the bulk load mechanism to load large volumes of data into the data mart (approx. 5 billion per week).
- Prepared scripts to set up the environment for running Informatica workflows on a UNIX server to synchronize the source systems.
- Designed and Created ETL Design Documents and Process Flow Diagrams.
- Developed ETL objects - mappings, sessions, workflows, scheduling, deployment planning, etc. - for the Common Data Storage Warehouse and Marts.
- Responsible for Migration and Deployment of Code to Test/UAT and PROD Environments and Scheduling the Jobs.
- Supported daily operations and fixed production issues (hot fixes).
- Performed basic Informatica administrator activities.
- Prepared JIL (Job Information Language) code to schedule the jobs in Autosys.
- Worked closely with reporting team and resolved the performance issues.
- Extensively used PowerCenter to load data from source systems such as flat files and Excel files into staging tables and then into the target Oracle database. Analyzed the existing systems and performed a feasibility study.
- Worked with the BMC Remedy User ticketing system to receive tickets for job failures.
- Prepared complex SQL override scripts at the Source Qualifier level to avoid Informatica Joiners and Lookups and improve performance, as the data volume was heavy.
- Performed performance tuning: identified and fixed bottlenecks and tuned complex Informatica mappings.
- Used existing procedures (helper package) to build indexes and gather table-level statistics as part of the pre-load and post-load utilities (a rough PL/SQL sketch follows this list).
- Reviewed test cases written by team members.
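A rough post-load sketch in the spirit of the helper package mentioned above: rebuild indexes left unusable by the bulk load and gather table-level statistics; the SALES_FACT table name is an assumption:

```sql
BEGIN
    -- Rebuild any indexes left unusable by the bulk/direct-path load.
    FOR idx IN (SELECT index_name
                FROM   user_indexes
                WHERE  table_name = 'SALES_FACT'
                AND    status = 'UNUSABLE')
    LOOP
        EXECUTE IMMEDIATE 'ALTER INDEX ' || idx.index_name || ' REBUILD';
    END LOOP;

    -- Refresh optimizer statistics at the table level.
    DBMS_STATS.GATHER_TABLE_STATS(ownname => USER,
                                  tabname => 'SALES_FACT',
                                  cascade => TRUE);
END;
/
```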
Environment: Informatica Power Center 9.1/9.5 (Repository Manager, Designer, Workflow Manager, Workflow Monitor), Oracle 11g, SQL Developer, Flat Files, Autosys, AIX, Exadata, SQL Server 2008, Windows NT, Shell Scripting, BMC Remedy User, MicroStrategy
Confidential, Cleveland, OH
Sr.ETL Developer
Responsibilities:
- Involved in requirement gathering, analysis and designing technical specifications for the data migration according to the Business requirement.
- Created extensive documentation on development, implementation, daily loads and process flow of the mappings.
- Implemented incremental loading to load data that is refreshed on a daily basis.
- Experience with high volume datasets from various sources like Text Files and Relational Tables.
- Implemented Slowly Changing Dimensions.
- Implemented target load order to maintain referential integrity.
- Used FTP to place source files in the specified directory and run the jobs.
- Developed mappings to load tab-delimited files and send them to a third-party site using SFTP.
- Developed mappings using the Aggregator with sorted input to remove duplicate records.
- Created stored procedures to check whether a partition exists and to drop and recreate it (a rough PL/SQL sketch follows this list).
- Monitored cache usage by reviewing session logs and implemented performance tuning techniques to reduce the job completion time.
- Created parameter files to standardize database connection names and used parameters in overrides for frequently changing account types.
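A rough sketch of the partition-maintenance stored procedure described above; the procedure name, parameters and dynamic SQL are assumptions illustrating the check/drop/recreate pattern (it assumes the partition being reset is the highest range partition, so ADD PARTITION is valid):

```sql
CREATE OR REPLACE PROCEDURE reset_daily_partition (
    p_table_name     IN VARCHAR2,
    p_partition_name IN VARCHAR2,
    p_high_value     IN VARCHAR2)  -- upper bound expression passed as text
IS
    v_count NUMBER;
BEGIN
    -- Check whether the partition already exists.
    SELECT COUNT(*)
    INTO   v_count
    FROM   user_tab_partitions
    WHERE  table_name     = UPPER(p_table_name)
    AND    partition_name = UPPER(p_partition_name);

    -- Drop it if present so the daily load can be re-run cleanly.
    IF v_count > 0 THEN
        EXECUTE IMMEDIATE 'ALTER TABLE ' || p_table_name ||
                          ' DROP PARTITION ' || p_partition_name;
    END IF;

    -- Recreate the (highest) range partition with its upper bound.
    EXECUTE IMMEDIATE 'ALTER TABLE ' || p_table_name ||
                      ' ADD PARTITION ' || p_partition_name ||
                      ' VALUES LESS THAN (' || p_high_value || ')';
END reset_daily_partition;
/
```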
Environment: Informatica 8.1/8.6, Oracle 10g, PL/SQL, OBIEE, Windows, CA-7, UNIX
Confidential, Dearborn, MI
Sr. ETL Developer
Responsibilities:
- Documented Informatica mappings, design and validation rules.
- Developed mapping logic using various transformations to extract data from different sources such as flat files, IBM MQ Series and Oracle.
- Created sessions and workflows to load data from IBM DB2 UDB 8 databases hosted on an HP-UX 11i RISC server.
- Developed mapping logic using various transformations like Expression, Lookups (Connected and Unconnected), Joiner, Filter, Sorter, Update strategy and Sequence generator.
- Actively implemented Informatica performance tuning by identifying and removing the bottlenecks and optimized session performance by tuning complex mappings
- Extensively involved in unit testing, integration testing and system testing of the mappings and writing Unit and System Test Plan.
- Migrated objects from the development environment to the QA/testing environment to facilitate the testing of all objects developed and check their consistency end to end on the new environment.
- Involved in production support, resolving the production job failures, interacting with the operations support group for resuming the failed jobs.
Environment: Informatica Power Center 8.1, Autosys 4.5, HP-UX 11i, Oracle 10g/9i, IBM DB2 UDB 8.5, MQ Series, VSAM files