ETL Consultant Resume
CAREER SUMMARY:
Senior Informatica developer with 9 years of experience in ETL design for data warehousing systems using Informatica tools and PL/SQL programming. Capable of analyzing business requirements and database models, developing technical documents, creating mappings, sessions, and workflows with Informatica PowerCenter, and providing troubleshooting and production support for various Informatica systems. Hands-on experience building data visualizations and dashboards with Tableau Desktop.
SUMMARY OF SKILLS:
- Thorough knowledge of Informatica-based system development, ETL design, PowerCenter Data Integration and Data Quality, software design and development principles, data mapping and migration, data warehousing concepts, data replication and synchronization, SQL queries and stored procedures, Informatica data cleanup procedures, error handling and performance tuning, data analysis and debugging, and working in a test-driven environment
- Working knowledge of ETL tools such as Informatica PowerCenter 8.6 and 9.x and IBM DataStage on Windows and Unix operating systems
- Clear understanding of SQL and PL/SQL, and of Oracle, DB2, Teradata, and Greenplum databases
- Experience using the Control-M scheduling tool and providing production support with quick turnaround to meet SLAs
- Experience using data visualization tools such as Tableau and creating dashboards for reporting
- Expertise in Unix shell, SQL, and C# scripting
- Worked in Agile, Waterfall, and release-based development cycles
- Experience across multiple domains, including Banking, Finance, Insurance, Automobile, and Manufacturing
WORK EXPERIENCE:
ETL Consultant
Confidential
Responsibilities:
- Design and develop new ETL solutions, including enhancements and maintenance of existing ones
- Improve slow-running jobs through redesign and better ETL processes to meet business needs
- Support the team in designing, modeling, developing, and maintaining existing and new SQL database objects required for all business solutions
Senior Programmer Analyst
Confidential
Responsibilities:
- Gather business requirements and create design documents.
- Responsible for the full project life cycle, including system analysis and design, implementation, and development of mappings and test cases.
- Analyze dependencies on existing systems, if any, and develop an execution plan.
- Work with the data modeling and database administration teams to set up the environments required for build and execution.
- Responsible for source code management, control, and module integration.
- Report overall development progress and issues to the management team.
- Attend meetings with the data modeling and database teams to create tables and schemas that match requirements.
- Design mappings and test integration with existing code.
- Create a test bed for unit testing of ETL mappings and functions.
- Test for technical defects by creating test cases.
- Improve performance of slow-running jobs through redesign and better ETL processes to meet business needs.
- Schedule jobs and batches using Control-M, and monitor them to ensure smooth production flow.
- Coordinate effectively across groups in a multi-vendor environment.
- Work on databases to create intermediate and staging tables for data retrieval.
- Create visualizations for risk reporting using Tableau Desktop/Server, connecting to data marts.
- Communicate with business partners about converting tactical reports to Tableau.
Software Engineer
Confidential
Responsibilities:
- The project is a migration effort; source and target systems are identified from existing stored procedures through reverse engineering
- Understand the business functionality of the interface
- Prepare low-level design documents to frame the ETL mappings per the requirements
- Identify the source and target systems and the transformation logic involved in the existing system
- Create a mapping specification document to map old and new system components and attributes
- Work with Oracle SQL databases for fetching and loading data
- Develop ETL mappings using Informatica PowerCenter to migrate data from the existing systems to the new system without affecting the business
- Test the mappings against various scenarios and produce test cases and results
- Coordinate effectively across groups in a multi-vendor environment
- Analyze, gather, define and document all application requirements as specified by Business Analysts.
- Create jobs per the Business Analysts' requirements.
- Generate vehicle codes and VINs for newly produced GM vehicles and their parts
- Maintain vehicle and customer details related to General Motors in data marts by loading and processing the data with IBM DataStage.
- Test for technical defects by creating test cases and capturing test results.
- Participate in IQA and EQA reviews to produce defect-free code.
Software Engineer
Confidential
Responsibilities:
- Create mappings per the Business Analysts' requirements.
- Create appropriate documentation for business requirements and workflow.
- Review, repair, and modify programs to ensure their technical accuracy, security, and reliability
- Baseline key metrics, measure performance periodically, enhance products, troubleshoot key problem areas, and upgrade where necessary.
- Create Informatica mappings to load the target databases
- Prepare test plans and perform unit and system testing
- Perform code review and bug fixing.
- Tune mapping performance by identifying bottlenecks.
- Coordinate effectively across groups in a multi-vendor environment.
- Work with Teradata and DB2 databases for fetching and loading data.
- Provide production support using the IBM Prisma tool (SMILE).
- Work in a multi-vendor system and coordinate effectively with various teams to achieve high performance.
- Participate in the creation and execution of test cases and test scenarios.
- Participate in detailed integration testing of data coming from various source systems, which includes the following tasks (a sample of these checks is sketched after the list):
- Record count verification between source and target as a basic check
- Validation of source data
- Data integrity checks across source tables and their relationships
- Validation of calculated fields
- Checks for missing data and null values in key columns
- Field-by-field data verification to confirm consistency between source and target data
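A minimal SQL sketch of these checks is shown below. It is illustrative only: the SRC_CUSTOMER (source) and TGT_CUSTOMER (target) tables, the CUSTOMER_ID key, and the compared fields are hypothetical placeholders, not the actual project schema.

```sql
-- Record count verification between source and target (Oracle syntax, hence "dual")
SELECT
    (SELECT COUNT(*) FROM SRC_CUSTOMER) AS src_count,
    (SELECT COUNT(*) FROM TGT_CUSTOMER) AS tgt_count
FROM dual;

-- Check for null values in key columns of the target
SELECT COUNT(*) AS null_key_rows
FROM TGT_CUSTOMER
WHERE CUSTOMER_ID IS NULL;

-- Field-by-field verification: rows whose attributes differ between source and target
-- (NULL-safe comparison omitted for brevity)
SELECT s.CUSTOMER_ID
FROM SRC_CUSTOMER s
JOIN TGT_CUSTOMER t
  ON t.CUSTOMER_ID = s.CUSTOMER_ID
WHERE s.CUSTOMER_NAME   <> t.CUSTOMER_NAME
   OR s.CUSTOMER_STATUS <> t.CUSTOMER_STATUS;
```

Queries of this shape can be parameterized per source/target table pair and attached to the unit and integration test cases described above.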