
ELT Developer Resume


CA

SUMMARY:

  • Over 10 years of IT experience in the analysis, design, development, and implementation of Business Intelligence solutions using data warehouse/data mart design, ELT, OLAP, client/server, and web applications on Windows and UNIX platforms.
  • Teradata Certified Professional V2R5.
  • Strong knowledge of Teradata utilities: BTEQ, FastLoad, FastExport, MultiLoad, Teradata Administrator, and SQL Assistant.
  • Strong working experience with Oracle 11g, 10g, 9i, and 8i.
  • Strong experience with Oracle queues, materialized views, stored procedures, and triggers.
  • Extensive experience in Oracle query tuning and performance monitoring using OEM.
  • Experience automating scripts using Autosys, Oracle Scheduler, and crontab.
  • Involved in performance improvement activities: resolving system bottlenecks, performance tuning, error/exception handling, and scheduling and monitoring jobs in production.
  • Strong experience in transforming and loading data of multiple formats into the data warehouse.
  • Experienced in writing and deploying UNIX Korn shell scripts as part of the standard ETL processes and for job automation.
  • Strong working knowledge of querying, data loading, and report generation.
  • Experienced with the Erwin data modeling tool for creating LDMs and PDMs.
  • Data warehouse requirement analysis and development/implementation.
  • Experience using incident management tools such as IBM ClearQuest and ServiceNow.
  • Well versed with various aspects of ELT processes used in loading and updating Teradata/Oracle data warehouse.
  • Experience in writing Stored Procedures, Macros and Triggers in Teradata.
  • Extensive experience in Telecom billing & Retail services.
  • Strong experience in building tools/utilities using Java/PHP to automate business processes.
  • Good experience in creating dashboards and other custom built tools for data presentation, using Java/PHP/Flex.
  • Experience in writing complex queries and performance tuning of queries.
  • Experienced with planning, architecture, and design of Teradata data warehousing and SQL optimization.
  • Proficient in architectural diagrams, logic flowcharts, data maps, operating, maintenance, and support procedures, and detailed documentation of the technical design.
  • Extensive analytical, debugging and problem solving skills.
  • Excellent experience in finding the root cause of issues and troubleshooting problems in production systems.
  • Experienced with mentoring Teradata development teams, data modeling, program code development, test plan development, dataset creation, testing and result documentation, defect analysis, and bug fixing.
  • Full SDLC experience including Analysis, Design, Development, Testing and implementation of the system.
  • Team player with good communication, writing, and technical documentation skills; a self-motivated individual with exemplary analytical and problem-solving skills.
  • Demonstrates a willingness, interest, and aptitude to learn new technologies and skills.

SKILLS SUMMARY:

ETL & OLAP: Teradata TPT (Teradata Parallel Transporter)

RDBMS: Teradata 12.0/14.0, Oracle 10g/9i/8i

Programming Languages: SQL, PL/SQL, C/C++, JAVA, COBOL, UNIX Shell Scripts, Action Script

Tools & Utilities: TOAD, SQL*Plus, SQL*Loader, ODBC, Adobe Flex Builder 4

Data Modeling Tools: Erwin 4.5/4.0/3.5.2, MS Visio.

Operating Systems: UNIX, OS-X, Windows 2003/2000/NT

WORK EXPERIENCE:

Confidential, CA

ELT Developer

Responsibilities:

  • Developed UNIX wrappers for BTEQ to execute the ELT processes loading data into Confidential-specific tables.
  • Wrote FastExport scripts to extract data from the Teradata data warehouse into flat files.
  • Gathered requirements from business users and implemented the loading/extraction processes to provide these metrics.
  • Automated jobs using UC4.
  • Worked closely with the IR and Finance teams on questions and data quality issues.
  • Optimized FastExport and BTEQ scripts to improve performance.
  • Worked with the Enterprise team to ensure data accuracy.
  • Involved in project documentation, UTP and migrating code from development to production.
  • Maintained the Git code repository and the release engineering process using JIRA.
  • Performed Informatica forklift loads from Oracle data sources.
  • Tuned Oracle performance through OEM.
  • Worked with TOAD and SQL Developer to write queries and generate results.
  • Maintained existing scripts for materialized views, triggers, and stored procedures.
  • Interacted with various business users, source contacts, DBAs, and system administrators on production issues.
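The BTEQ wrapper scripts described above might look roughly like this minimal sketch. All names here (the job name, log path, and the `BTEQ_CMD` override) are illustrative placeholders, not from an actual Confidential script; it assumes `bteq` is on the PATH.

```shell
#!/bin/ksh
# Minimal sketch of a UNIX wrapper around BTEQ. JOB, LOGDIR, and
# BTEQ_CMD are hypothetical names used only for illustration.
JOB=${JOB:-sample_load}
LOGDIR=${LOGDIR:-/tmp}
LOG="$LOGDIR/${JOB}_$(date +%Y%m%d).log"

run_bteq() {
    # Run one BTEQ script, append its output to the job log, and
    # propagate the utility's return code to the scheduler (e.g. UC4).
    script="$1"
    ${BTEQ_CMD:-bteq} < "$script" >> "$LOG" 2>&1
    rc=$?
    if [ "$rc" -ne 0 ]; then
        echo "$JOB: step $script failed with rc=$rc (see $LOG)" >&2
    fi
    return $rc
}
```

In production the scheduler would call the wrapper once per load step and alert on a non-zero exit code.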

Environment: Teradata 14.10, UC4, UNIX, Oracle Confidential, Windows 7, Informatica, Visio, Oracle 11g

Confidential

Product Line Planning (OR)

Responsibilities:

  • Designed process-oriented UNIX scripts and ELT processes for loading data into the data warehouse.
  • Automated jobs using crontab and Autosys.
  • Worked closely with project managers, business analysts, source owners, and the data quality team to ensure timely and accurate delivery of business requirements.
  • Worked with DBAs on Teradata performance tuning via EXPLAIN plans, PPIs, AJIs, indexes, statistics collection, and code rewrites.
  • Monitored and maintained the health of Oracle Advanced Queues.
  • Modified and enhanced existing shell scripts and BTEQ scripts based on requirements.
  • Ensure availability of the database by working with the DBAs.
  • Coded test SQL, analyzed test case results, and documented test cases and plans.
  • Monitored data quality and integrity through end-to-end testing; reverse-engineered and documented existing ELT program code.
  • Work with the Cognos team to fix any issues related to the display of reports.
  • Worked on release tickets for defects, enhancements, performance tuning, and process optimization activities.
  • Assisted SIT/UAT testers in optimizing test scripts to ensure data accuracy and integrity, and provided production support.
  • Prepared standardized weekly and monthly reports for top management.
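The statistics-collection side of the tuning work above can be sketched as a small generator script whose output would be piped into BTEQ during a tuning pass. The database, table, and column names are hypothetical examples, not from a real warehouse.

```shell
#!/bin/ksh
# Sketch: emit a COLLECT STATISTICS / EXPLAIN script for a table so it
# can be fed to BTEQ while tuning. Table and column names below are
# hypothetical.
gen_tuning_sql() {
    tbl="$1"      # fully qualified table, e.g. edw_db.sales_fact
    col="$2"      # column driving the typical filter predicate
    cat <<EOF
COLLECT STATISTICS ON ${tbl} COLUMN (${col});
EXPLAIN SELECT COUNT(*) FROM ${tbl} WHERE ${col} = CURRENT_DATE;
EOF
}
```

A real run might look like `gen_tuning_sql edw_db.sales_fact load_dt | bteq`, after which the EXPLAIN output is reviewed for product joins or redistribution steps.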

Environment: Teradata 13.0, Autosys, ServiceNow, UNIX, Oracle 11g, Windows 7, PL/SQL, TOAD, ERWIN, SQL Developer, Visio.

Confidential, TX

Data Migration

Responsibilities:

  • Understanding existing business model and customer requirements.
  • Requirements gathering and analysis, Design, Development and Testing.
  • Providing optimal Data warehousing solutions to the Business.
  • Loaded historical and incremental data into dimension/fact tables using Teradata utilities.
  • Extensively worked in data Extraction, Transformation and Loading from source to target system using BTEQ, Fast Load and MultiLoad.
  • Did data reconciliation across various source systems.
  • Wrote UNIX shell scripts for batch processing, data loading with Teradata, and job scheduling.
  • Modified existing BTEQ, FastLoad, and MultiLoad scripts for better performance.
  • Involved in identifying system bottlenecks, performance tuning, error/exception handling, and scheduling and monitoring jobs in production.
  • Answered ad hoc requests from business users.
  • Preparation of the Unit test plan and test results along with the test data.
  • Attending Client Meetings on providing the progress details of the tasks.
  • Worked with team members on design, development, and defect resolution, and resolved issues and problems faced by the team.
  • Served as SME on VASIP-related questions for the mapping team.
  • Created an in-house business dashboard using Adobe Flex (JAVA backend) where business owners could view real-time graphs and trend reports for sales, statistics and other mission critical information.

Environment: Teradata 13.0, J2EE, JIRA v4.3.2, WebLogic Application Server, UNIX, Windows 2000

Confidential

ELT Developer

Responsibilities:

  • Loaded data into Teradata using FastLoad, BTEQ, FastExport, MultiLoad, and Korn shell scripts
  • Enhanced the existing LDM and PDM for the billing data warehouse.
  • Responsible for Performance Tuning of SQL queries.
  • Responsible for identifying bottlenecks and resolving them through performance tuning.
  • Tested target data against source system tables by writing SQL queries and documented the results in an Excel sheet.
  • Coded complex, highly optimized SQL against the core EDW (a Third Normal Form Financial Services physical data model) to satisfy denormalized data mart requirements.
  • Worked on Shell scripts to schedule batches on Cron Jobs.
  • Involved in designing ER diagrams, the logical model (facts and dimensions), and the physical database (capacity planning and aggregation strategies) per business requirements with Erwin, using star schema, snowflake schema, and 3NF.
  • Populated the staging areas from various sources such as flat files (fixed-width and delimited) and SQL Server.
  • Understanding existing business model and customer requirements.
  • Requirement gathering, analysis, participating in client calls, scrum meeting and design meetings.
  • Involved in preparing project documentation like design, production support, UTP, code review checklist, migration and sprint workbooks for every project.
  • Extracted data from various heterogeneous sources like Flat Files, XML and SQL server 2005/2008.
  • Developed UNIX shell scripts calling reusable DataStage jobs for staging flat-file data into the staging database.
  • Involved in developing scripts that call the internal warehouse automation process for loading data into fact and dimension tables using reusable DataStage jobs and BTEQ scripts.
  • Used Teradata MultiLoad and FastLoad for direct loads.
  • Used FastExport scripts for exporting to flat files.
  • Used TPT for loading data from one or more Teradata tables into a Teradata table at the development level.
  • Implemented change requests and handled production support tickets and service requests.
  • Answered ad hoc requests from business users.
  • Responsible for implementing daily, weekly, monthly, and quarterly loads using shell scripts.
  • Tuned queries and scripts by identifying indexes, strategies, etc.
  • Implemented steps to eliminate duplicates and resolve spool space issues.
  • Created stage tables and indexes according to client instructions.
  • Scheduled cron jobs using crontab.
  • Maintained code versioning using IBM ClearCase v7.0.1 and the incident management tool IBM ClearQuest.
  • Wrote unit test cases, submitted unit test results, and migrated code from development to SIT (System Integration Testing) and from SIT to production by creating migration documents.
  • Attended production support review calls before migrations to discuss challenges in the release elements.
  • Modified shell scripts to dynamically change the parameter variables used in mappings, start the ELT process, check for session success, and notify status via email.
  • Worked in an Agile methodology, creating and maintaining sprint workbooks and attending daily scrum meetings with the team lead and team members.
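The dynamic-parameter and email-notification shell work in the last bullets could look roughly like the sketch below. The `{{LOAD_DT}}` template token and the `MAIL_CMD` override are illustrative conventions, not from a real project; in production the notifier would invoke `mailx`.

```shell
#!/bin/ksh
# Sketch: render runtime variables into a mapping parameter template and
# report load status. {{LOAD_DT}} and MAIL_CMD are hypothetical names.
render_params() {
    # Substitute the current load date into a parameter template file.
    template="$1"
    sed -e "s/{{LOAD_DT}}/${LOAD_DT}/g" "$template"
}

notify_status() {
    # In production this would pipe to `mailx -s ...`; MAIL_CMD lets a
    # test substitute a harmless command such as cat.
    status="$1"
    echo "ELT load finished with status: $status" | ${MAIL_CMD:-cat}
}
```

A wrapper would call `render_params` before starting the ELT session, check the session's exit code, and then call `notify_status SUCCESS` or `notify_status FAILURE` accordingly.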

Environment: Teradata 12, Flat Files, PL/SQL, TOAD, UNIX
