ETL Team Lead Resume
Jacksonville, FL
SUMMARY
- Around 10 years of IT experience with professional expertise in Data Integration and Data Warehousing systems.
- Over 7 years of ETL tool experience using Ab Initio and DataStage in the banking domain.
- Strong understanding of Data Warehousing techniques and ETL methodologies supporting data extraction, transformation and loading.
- Well versed in Ab Initio parallelism techniques; implemented Ab Initio graphs using data parallelism and multifile system (MFS) techniques.
- Well versed with various Ab Initio components such as Join, Rollup, Partition by Key and Sort, Gather and Merge.
- Good experience with data migration, transformation and loading using Ab Initio into Oracle databases and flat files.
- Excellent experience in designing and developing Ab Initio graphs using GDE environment.
- Expertise in Developing Transformations between Source and Target using Ab Initio.
- Worked on Ab Initio production job scheduling process using the Autosys tool.
- Experience in testing Ab Initio applications in different environments.
- Extensive database experience using SQL, Oracle, Teradata.
- Strong in writing UNIX shell scripts for data validation, data cleansing, etc.
- Case study on building analyses and dashboards using OBIEE.
- Hands-on experience working in the US and UK IT industries.
TECHNICAL SKILLS
ETL: Ab Initio, DataStage.
BI: OBIEE.
Languages: SQL, UNIX Shell Scripting
Databases: Teradata 12, Oracle 11g
Scheduling Tools: Autosys, CA7
SQL Tools: Teradata SQL Assistant, Toad for Oracle.
PROFESSIONAL EXPERIENCE
Confidential, Jacksonville, FL
ETL Team Lead
Responsibilities:
- Reviewed functional documentation and converted functional documents into technical design documents.
- Developed, tested and reviewed complex Ab Initio graphs, sub-graphs, DML, XFR, Psets, deployed scripts and DBC files for connectivity.
- Extracted data from various sources, performed data transformations and loaded the results into target interface files and tables.
- Improved Ab Initio graph performance using techniques such as in-memory joins and rollups, and by avoiding deprecated components.
- Developed wrapper scripts to periodically notify users in case of any failures with debugging information.
- Worked extensively with the Ab Initio Enterprise Meta Environment (EME) to obtain initial setup variables and to maintain version control during development.
- Supported System Integration and User Acceptance Testing cycles executed by the automation team, fixing design issues.
- Wrote UNIX scripts as required for preprocessing steps and to validate input and output data elements.
- Tested Ab Initio applications in Unit, CIT/SIT and UAT environments.
- Involved in migration of code from development to other test environment and to production.
- Involved in change management using Maximo by creating RFCs and IRs to raise defects.
- Provided expert guidance to the rest of the team on design and development issues.
Environment: Ab Initio GDE 1.15, Co>Operating System 2.15, AIX UNIX, SQL.
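The failure-notification wrapper described in the bullets above can be sketched in shell. This is a minimal illustration, not the production script: the log paths and the graph script argument are assumptions, and the real version would pipe the alert to mailx rather than append it to a log file.

```shell
# Minimal sketch of a failure-notification wrapper (illustrative paths only).
# Runs a deployed Ab Initio script; on failure, captures the exit code and the
# tail of the run log so users receive debugging context with the alert.

LOG_DIR=${LOG_DIR:-/tmp}
RUN_LOG="$LOG_DIR/etl_run.log"
ALERT_LOG="$LOG_DIR/etl_alerts.log"    # stand-in for a mailx notification

run_graph() {
    # $1 = path to the deployed graph script (hypothetical)
    "$1" >> "$RUN_LOG" 2>&1
    rc=$?
    if [ "$rc" -ne 0 ]; then
        {
            echo "FAILURE: $1 exited with rc=$rc at $(date)"
            echo "--- last 20 log lines ---"
            tail -20 "$RUN_LOG"
        } >> "$ALERT_LOG"
        # The production version would send this via mailx to the on-call list.
    fi
    return "$rc"
}
```

Keeping the full run log separate from the alert channel keeps the notification short while still carrying enough context to start debugging.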
Confidential, Concord, CA
Ab Initio Developer
Responsibilities:
- Used Korn shell scripting to maximize Ab Initio parallelism capabilities and developed numerous Ab Initio graphs using data parallelism and Multi File System (MFS) techniques.
- Used the Run Program and Run SQL components to execute UNIX and SQL commands from within Ab Initio.
- Performed unit and system testing using sample data: generated data, manipulated dates and verified the functionality, data quality and performance of graphs.
- Used various Ab Initio components such as Partition by Key and Sort, Dedup, Rollup, Reformat and Join across graphs.
- Used phasing to sequence plans.
- Performed program construction and modification for problem fixes and new development.
- Involved in code reviews and performance tuning strategies.
- Conducted various testing cycles.
- Led project discussions to consolidate current status and requirements.
- Conducted quality reviews of design documents, code and test plans.
Environment: Ab Initio GDE 1.14, Co>Operating System 2.15, UNIX, SQL.
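The unit-testing bullets above (generating sample data and verifying graph output) can be sketched as a small shell harness. The pipe-delimited id|amount|date layout is invented for illustration; the real tests verified Ab Initio graph output.

```shell
# Illustrative test harness: generate pipe-delimited sample records and
# compare a graph's actual output against a hand-built expected baseline.
# The id|amount|date row layout is an assumption made for this sketch.

gen_sample() {
    # $1 = number of rows to emit
    i=1
    while [ "$i" -le "$1" ]; do
        printf '%d|%d.00|2024-01-%02d\n' "$i" $((i * 10)) $(( (i % 28) + 1 ))
        i=$((i + 1))
    done
}

compare_output() {
    # $1 = actual output file, $2 = expected baseline file
    if diff -q "$1" "$2" > /dev/null 2>&1; then
        echo "PASS: $1 matches baseline"
    else
        echo "FAIL: $1 differs from baseline"
        diff "$1" "$2" | head -20    # show the first differences for debugging
        return 1
    fi
}
```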
Confidential
Lead DataStage Analyst
Responsibilities:
- Extensively used IBM Information Server Designer to develop jobs that extract, cleanse, transform, integrate and load data into target tables.
- Prepared technical designs/specifications for data extraction, transformation, cleansing and loading.
- Used IBM Information Server Designer to transform data through multiple stages and prepared documentation.
- Extracted data from various sources, performed data transformations and loaded the results into the target Oracle database.
- Imported and exported DataStage jobs using IBM Information Server Designer.
- Designed DataStage parallel jobs with Change Capture and Change Apply stages.
- Implemented logic for Slowly Changing Dimensions.
- Automated and fine-tuned IBM Information Server Designer jobs and sequences for loading source system data into the data warehouse.
- Extensively wrote user-defined SQL to override auto-generated SQL queries in DataStage.
- Used local and shared containers to increase code reusability and system throughput.
- Used different partitioning methods such as Auto, Hash, Same and Entire.
- Developed job sequences with restartability checkpoints and implemented proper failure actions.
- Participated in the review of Technical, Business Transformation Requirements Documentation.
- Actively participated in team meetings to gather business requirements and develop specifications.
- Provided on-call support on a 24/7 basis.
Environment: DataStage, UNIX, Teradata SQL
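The Change Capture / Change Apply work above can be illustrated with a plain-shell analogue: comparing a keyed "before" and "after" file and flagging each row as an insert, edit or delete. The key|value format is an assumption for the sketch; the actual jobs used DataStage's built-in stages.

```shell
# Plain-shell analogue of a Change Capture stage (illustrative only).
# Input files hold key|value rows; output rows are prefixed I (insert),
# E (edit) or D (delete) relative to the before image.

change_capture() {
    # $1 = before file, $2 = after file
    awk -F'|' '
        NR == FNR { before[$1] = $2; next }           # load the before image
        {
            if (!($1 in before))       print "I|" $0  # key only in after
            else if (before[$1] != $2) print "E|" $0  # value changed
            seen[$1] = 1
        }
        END {
            for (k in before)
                if (!(k in seen)) print "D|" k "|" before[k]  # key dropped
        }
    ' "$1" "$2"
}
```

Unchanged rows are suppressed, so downstream apply logic only sees the delta, which is the same idea the Change Apply stage consumes.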
Confidential
ETL Developer
Responsibilities:
- Designed ETL jobs to extract data from heterogeneous source systems and automated job runs for the ETL process.
- Involved in designing ETL processes to extract data from Oracle and load it into target Oracle databases, applying business logic in transformations for data cleansing and record insertion.
- Loaded historical data and performed daily data extraction.
- Scheduled jobs using shell scripts and the Autosys utility.
- Monitored processes daily and coordinated with onsite teams to resolve issues in the daily pass.
- Designed additional ETL processes in response to client requirement changes.
Environment: Oracle, UNIX.
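The shell-plus-Autosys scheduling described above can be sketched as follows. The job name, machine, start time and output paths are invented for illustration; the JIL fragment in the comment shows only the general shape of a scheduling definition, not the actual one.

```shell
# Illustrative daily-extract driver that a scheduled Autosys job might invoke.
# A JIL definition of the general shape (all values are placeholders):
#
#   insert_job: daily_extract   job_type: c
#   command: /app/etl/bin/daily_extract.sh
#   machine: etlhost01
#   start_times: "02:00"

daily_extract() {
    # $1 = run date (YYYY-MM-DD); defaults to today
    run_date=${1:-$(date +%Y-%m-%d)}
    out_file="${OUT_DIR:-/tmp}/extract_${run_date}.dat"
    # Placeholder for the real pull (e.g. a sqlplus spool against Oracle);
    # here we only stamp the run date so the sketch is self-contained.
    echo "RUN_DATE=${run_date}" > "$out_file"
    echo "$out_file"    # report where the extract landed
}
```

Taking the run date as a parameter lets the same script serve both the scheduled daily pass and manual reruns of a missed date.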