ETL Developer Resume
SUMMARY
- 4.9 years of proven experience as an ETL Developer and Sustainment Engineer.
- Exposure to ETL tools such as IBM DataStage, SSIS and Optymyze (cloud ETL).
- Experience working with Oracle, SQL Server and Teradata databases.
- Expertise in UNIX shell scripting, SQL, PL/SQL, Informatica and Teradata production support, sustainment and development.
- Experience in working on Callidus ETL and Data Integration.
- Extensively worked on data extraction, transformation and loading from various sources such as Oracle and flat files.
- Solid expertise in Oracle stored procedures, triggers, indexes and table partitions; experienced in loading data from flat files, XML files and Oracle into data warehouses/data marts using Informatica.
- Proficient in coding optimized Teradata batch-processing scripts for data transformation, aggregation and load using BTEQ.
- Expertise in RDBMS and database normalization and denormalization concepts and principles.
- Strong experience creating database objects such as tables, views, functions, stored procedures, indexes, triggers and cursors in Teradata.
- Strong skills in debugging Teradata utilities such as FastLoad, FastExport, MultiLoad and TPump for high-volume Teradata ETL processing.
- Sound knowledge of data warehousing concepts, E-R modeling (3NF) and dimensional modeling (Star Schema, Snowflake Schema), and database architecture for OLTP and OLAP applications, data analysis and ETL processes.
- Sound knowledge of and hands-on experience with ETL concepts and logic.
- Strong experience with ETL tool Informatica 8.5/9.6.
- Created mapping documents, workflows and data dictionaries.
- Good knowledge of data warehouse concepts and principles: Star Schema, Snowflake, SCD, surrogate keys, normalization/de-normalization.
- Quick adaptability to new technologies and zeal to improve technical skills.
- Good analytical, programming, problem solving and troubleshooting skills.
- Basic knowledge of Python programming for data analysis.
- Basic knowledge of machine learning concepts and data engineering.
- Basic knowledge of data structures and data models.
- Good knowledge of the data flow from its raw form through to its operational form.
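As an illustration of the SCD handling listed above, the following is a minimal Type 2 merge sketch in Python. It is illustrative only: the dimension layout, the `dealer_id` business key and the column names are hypothetical, not taken from any actual project mapping.

```python
from datetime import date

HIGH_DATE = date(9999, 12, 31)  # open-ended expiry marking the current row version

def apply_scd2(dimension, incoming, business_key, load_date):
    """Minimal SCD Type 2 merge: expire changed rows, insert new versions.

    `dimension` is a list of dicts with a surrogate key 'sk', the business
    key, tracked attributes, 'eff_date' and 'exp_date' (hypothetical layout).
    """
    next_sk = max((r["sk"] for r in dimension), default=0) + 1
    for new in incoming:
        # Find the currently active version for this business key, if any.
        current = next(
            (r for r in dimension
             if r[business_key] == new[business_key] and r["exp_date"] == HIGH_DATE),
            None,
        )
        if current is None or any(current[k] != v for k, v in new.items()):
            if current is not None:
                current["exp_date"] = load_date  # close the old version
            dimension.append({"sk": next_sk, **new,
                              "eff_date": load_date, "exp_date": HIGH_DATE})
            next_sk += 1
    return dimension
```

For example, feeding a changed `region` for an existing dealer expires the old row and appends a new current row with a fresh surrogate key; unchanged incoming rows are left alone.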
TECHNICAL SKILLS
Operating Systems: UNIX, Linux. Office tools: Microsoft Office 2003/XP/2007/2016.
Database: Teradata, Oracle, Vertica, MS SQL.
Languages: SQL, Teradata SQL, PL/SQL; basic knowledge of VSQL, C and Java.
Web Technologies: basic knowledge of HTML and J2EE.
ETL Tools/Techs: Informatica 8.5/9.6.1, SSIS (basic knowledge).
Schedulers: Tivoli Workload Scheduler (UNIX and Linux).
PROFESSIONAL EXPERIENCE
Confidential
ETL Developer
Responsibilities:
- Gathering requirements from the client as per the business, acting as an ETL SME.
- Analysing the requirements.
- Migrating application objects from Callidus DLM to the cloud-based Optymyze DLM.
- Migrating the Informatica-based ETL application to a cloud-based ETL application.
- Re-creating all transformations with the additional logic required by the business.
- Developing and deploying new code into production.
- Working on automating some of the processes to reduce manual work using shell scripting.
- Developing ETL objects using Informatica to maintain dealer details in Optymyze.
- Working on Triggers and Stored procedures to generate continuous alerts.
- Migrating part of the previous data into new environment using SQL/PLSQL blocks.
- Testing the developed scripts against given inputs and verifying the retrieved results.
- Developing, Testing and deploying into production environment.
- Monitoring the environment 24x7.
- Fixing production issues.
- Working on change requests for deploying new/updated code.
- Providing permanent fixes for production issues.
- Working on client reports.
- Building Transformation logics for Incentive and payout calculations.
- Building complex transformations as per the requirements and ensuring large volumes of data are loaded into the cloud.
- Extensive use of statistics for end-user payout calculations.
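The incentive and payout transformation logic described above can be sketched as a tiered commission calculation in Python. The tier thresholds and rates here are hypothetical placeholders, not the actual business rules, which lived in the Optymyze/Callidus configuration.

```python
# Hypothetical tier table: (lower threshold, rate). Real cutoffs and rates
# come from the business rules, not from this sketch.
TIERS = [(0, 0.01), (50_000, 0.02), (100_000, 0.03)]

def payout(sales_amount: float) -> float:
    """Apply each tier's rate to the slice of sales falling inside that tier."""
    total = 0.0
    for i, (lower, rate) in enumerate(TIERS):
        upper = TIERS[i + 1][0] if i + 1 < len(TIERS) else float("inf")
        if sales_amount > lower:
            total += (min(sales_amount, upper) - lower) * rate
    return round(total, 2)
```

With these placeholder tiers, 120,000 in sales yields 500 + 1,000 + 600 = 2,100: each slice of the amount is paid at its own tier's rate rather than the whole amount at the top rate.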
Confidential
ETL Developer
Responsibilities:
- Integrating data from different sources, including DWH applications, as an ETL consultant.
- Creating ETL objects using Informatica as per the business requirements.
- Creating UNIX shell scripts holding Informatica runtime configurations to load data into stage tables.
- Creating/running pipelines in Callidus to load data into the Callidus “Truecomp DB”.
- Once data is loaded into the “Truecomp DB”, calculating dealer incentives in TrueComp Manager using a set of rules.
- After calculating dealer incentives, computing payouts for the respective months depending on business indicators and dealer type.
- Generating payment reports and other report types and publishing them to the respective teams.
- Developing and deploying new code into production as per changes and requirements.
- Working on automating some of the processes to reduce manual work using shell scripting.
- Using triggers and stored procedures as per requirements in PL/SQL.
- Developing, Testing and deploying into production environment.
- Monitoring the environment 24x7.
- Fixing production issues.
- Working on change requests for deploying new/updated code.
- Providing permanent fixes for production issues.
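The stage-table load step described above can be sketched in Python using the standard `csv` module. The pipe-delimited layout and the `dealer_id`/`name`/`sales_amount` columns are assumptions for illustration; the real feeds were loaded through Informatica mappings before the Callidus pipelines picked them up.

```python
import csv
import io

def load_stage(feed):
    """Parse a pipe-delimited feed (hypothetical layout:
    dealer_id|name|sales_amount) into stage rows, diverting malformed
    records to a reject list instead of failing the whole load."""
    good, rejects = [], []
    for lineno, row in enumerate(csv.reader(feed, delimiter="|"), start=1):
        try:
            dealer_id, name, amount = row       # wrong arity raises ValueError
            good.append({"dealer_id": dealer_id.strip(),
                         "name": name.strip(),
                         "sales_amount": float(amount)})
        except ValueError:
            rejects.append((lineno, row))       # bad arity or non-numeric amount
    return good, rejects
```

Keeping rejects alongside their line numbers mirrors the usual stage-load pattern: the batch completes, and bad records are reported back rather than silently dropped.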
Confidential
ETL Sustainment/Developer
Responsibilities:
- Monitoring workflows using Informatica Workflow Monitor, Teradata sessions in Teradata Viewpoint, and jobs through the TWS scheduler.
- Worked on Teradata stored procedures and functions to conform the data and load it into the target tables.
- Developed procedures to populate the customer data warehouse with transaction data, cycle and monthly summary data, and historical data.
- Worked on optimizing and tuning Teradata views and SQL to improve batch performance and data response time for users.
- Worked closely with analysts to come up with detailed solution approach design documents.
- Proposing and promoting query and design changes as per the production requirement.
- Identifying the root cause of the issues and providing a permanent solution.
- Analysing the data flow, identifying the root cause of data issues and providing fixes.
- Working on client requests such as data correction and data validation on priority.
- Supporting the applications during any outage window.
- Working on Change Requests as per the business requirement.
- Checking upstream and downstream systems availability and sending timely updates.
- Rigorous checks on file availability in the Confidential landing directory/staging area.
- Informing the Business for any delay of data and production outage.
- Preparing reports based on business requests on an hourly, daily, weekly, monthly and ad-hoc basis.
- Automating the monitoring of TWS object failures.
- Cleaning up file system disk space before mount points reach 100% utilization on UNIX and Linux servers.
- Continuously checking the performance of application servers, source DBs and target DBs.
- Leading the team and helping them in problem solving as a duty manager.
- Hosting the Internal and client calls.
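The pre-batch file availability checks described above can be sketched in Python. The expected feed file names below are illustrative; in practice the list would come from the batch calendar driving the TWS job streams.

```python
from pathlib import Path

def missing_feeds(landing_dir, expected):
    """Return expected feed files not yet present in the landing directory,
    the kind of availability check run before releasing downstream TWS jobs.
    File names are hypothetical examples."""
    present = {p.name for p in Path(landing_dir).iterdir() if p.is_file()}
    return sorted(f for f in expected if f not in present)
```

A scheduler wrapper would call this in a loop and page the on-call team (or hold the job stream) while the returned list is non-empty, instead of letting the load start against an incomplete staging area.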