ETL/Teradata Developer & Techno-Functional Lead Resume
SUMMARY:
- Software professional with 7 years of IT experience and a strong background in Data Warehousing, with emphasis on business requirements analysis, application design, development, testing, and implementation in production, plus solid experience managing large projects and communicating with end-client users.
- Technical expertise in Teradata, Informatica (ETL), Vertica, Oracle PL/SQL, and DB2, with good exposure to Unix shell scripting, Python, Unica, Control-M, AutoSys, and IBM Tivoli scheduling, as well as reporting tools such as Tableau, Business Intelligence, and UI Portal dashboards.
- Strong knowledge of Teradata RDBMS architecture (AMP, PE, BYNET, and data distribution). Extensively created and used Teradata Set, Multiset, Global Temporary, and Volatile tables.
- Hands-on experience with Teradata SQL analytics and Teradata utilities; familiar with creating secondary indexes and join indexes in Teradata.
- Strong working experience in Teradata query performance tuning by analyzing CPU usage, AMP distribution, table skewness, and I/O metrics (a sample skew check appears after this summary).
- Extensively worked with Teradata utilities such as BTEQ, FastLoad, FastExport, MultiLoad, TPump, and TPT.
- Wrote complex SQL procedures to perform aggregations in the semantic layer, as well as procedures that generate queries dynamically for UI dashboard reporting.
- Very strong in writing large, complex analytical SQL queries that help clients report on their data quickly.
- Strong experience in analysis, design, development, testing, and implementation of Business Intelligence solutions using data warehouse/data mart design, ETL, OLAP/OLTP, and BI applications.
- Hands-on experience developing OLAP reports and dashboards in Business Objects, Dashboards, and UI Portal.
- Experience in all phases of data warehouse development, from requirements gathering through coding, unit testing, production, and documentation.
- Experience pinpointing the longest-running queries, troubleshooting performance issues, and optimizing query performance in the data warehouse.
- Automated data replication across multiple Teradata environments and to Oracle and Vertica.
- Good understanding of conceptual, logical, and physical designs; familiar with ER diagrams.
- Wrote dynamic Oracle procedures to fetch real-time data using API calls.
- Optimized Oracle queries using tuning techniques such as hints, parallel processing, and optimizer rules. Wrote SQL and PL/SQL scripts to create and drop database objects including tables, views, primary keys, indexes, and constraints.
- Extensively worked with Oracle PL/SQL stored procedures, triggers, and functions; involved in query optimization and real-time API calls for reporting purposes.
- Good knowledge of Vertica platform architecture, with experience loading, backing up, restoring, and recovering data in the Vertica database.
- Hands-on experience writing large, complex SQL queries in Vertica.
- Created Vertica projections, views, and tables; wrote vsql scripts to load data from data warehouse tables into the Vertica database.
- Good experience in query tuning for better performance in Vertica.
- Extensive experience writing UNIX shell scripts and automating ETL processes with them.
- Working knowledge of creating EAI interfaces.
- Extensive ETL experience using Informatica PowerCenter 10.1.0 (Designer, Workflow Manager, Workflow Monitor, mappings, and mapplets).
- Strong experience in extracting, transforming, and loading (ETL) data from various sources into data warehouses and data marts using Informatica PowerCenter (Repository Manager, Designer, Workflow Manager, Workflow Monitor, Metadata Manager).
- Designed complex mappings using ETL transformations; implemented Slowly Changing Dimension methodology for maintaining the history of account and transaction information.
- Implemented mapping reusability using workflow variables, mapping variables, and mapping parameters.
- Good knowledge of migrating ETL interfaces from one repository to another.
- Identified bottlenecks in ETL processes and improved performance using partitioning, index usage, aggregate tables, and normalization/de-normalization strategies.
- Worked on semantic-layer migration from one production environment to another, and on migrating ETL interfaces to Storm.
- Experienced with the CA Workload Automation tool and IBM Tivoli scheduler for AutoSys job creation, scheduling, monitoring, and promotion into production.
- Good working knowledge of dimensional data modeling: star/snowflake schemas, fact and dimension tables, and physical and logical data models.
- Developed Tableau visualizations to generate and refresh reports per business needs.
- Good knowledge of Python.
- Eager to learn new technologies and tools; highly motivated, with the ability to work effectively and efficiently both in teams and independently.
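As an illustration of the skew analysis mentioned above, the query below is a minimal sketch of a table-skew check against the DBC.TableSizeV system view; the database name edw_db is an illustrative placeholder, not an actual project object.

    -- Compare per-AMP CurrentPerm for each table; a high skew_pct means
    -- rows are unevenly distributed across AMPs and the primary index
    -- choice should be revisited.
    SELECT DatabaseName,
           TableName,
           SUM(CurrentPerm) AS total_perm_bytes,
           CAST((1 - AVG(CurrentPerm) / NULLIFZERO(MAX(CurrentPerm))) * 100
                AS DECIMAL(5,2)) AS skew_pct
    FROM DBC.TableSizeV
    WHERE DatabaseName = 'edw_db'   -- illustrative database name
    GROUP BY DatabaseName, TableName
    ORDER BY skew_pct DESC;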
TECHNICAL SKILLS:
ETL Tools: Informatica PowerCenter 10.1.0, Storm
Databases: Teradata 15.0, Vertica, Oracle
Teradata Utilities: BTEQ, FastLoad, FastExport, MultiLoad, TPump, and TPT
Languages: PL/SQL, Unix shell scripting, and Python
Reporting Tools: Business Objects, Tableau, Web Intelligence, Portal Dashboard
Tools: Aqua Data Studio, Teradata SQL Assistant, SQL Workbench, SQL Developer, Toad for DB2, PuTTY
Version Control: Git, Subversion
Schedulers: CA Workload Automation tool, IBM Tivoli Workload Scheduler, Control-M
Operating Systems: macOS, Windows XP, 7, 10
PROFESSIONAL EXPERIENCE:
Confidential
ETL/Teradata Developer & Techno-Functional Lead
Environment: ETL Tool Informatica 10.1.0, Teradata 15.0, SQL Developer, UNIX Shell Scripting, IBM Tivoli Workload Scheduler (Maestro), PuTTY, and Toad for DB2
Responsibilities:
- Understanding business requirements and converting them into technical specifications.
- Analyzed business requirements, designed, developed, and implemented the code in production.
- Created Informatica mappings and workflows to fetch data from various source systems into Teradata tables.
- Wrote scripts to load huge volumes of data from legacy systems into the target data warehouse using the BTEQ, FastLoad, and MultiLoad utilities (a BTEQ load sketch appears at the end of this section).
- Wrote Unix shell scripts to check and process incoming data files, and to execute BTEQ scripts that load data into Teradata tables.
- Created tables, views, and BTEQ SQL scripts for data loading, and replicated data from Oracle to Teradata.
- Tuned long-running, high-CPU-consumption queries for better performance.
- Created schedules/jobs to load millions of records into Teradata efficiently using the IBM scheduling tool.
- Wrote SQL scripts to extract data from the database for testing purposes.
- Wrote procedures and functions to load data into Oracle tables.
- Created Informatica jobs to extract data from the EDW and load it into CDW DB2 tables.
- Prepared the design and technical specification documents; involved in Dev, IT, UAT, and production implementation.
- Provided support during production go-live.
- Worked in application production support as a Tier-2 lead, identifying and troubleshooting application-related issues and finding the root cause of data issues.
- Supported the resolution of escalated tickets and acted as liaison to business and technical leads to ensure issues were resolved in a timely manner.
- Suggested fixes to complex issues through thorough analysis of each defect's root cause and impact.
- Assisted the technical team in identifying and resolving data quality issues.
- Strong analytical and problem-solving abilities, as well as an exceptional customer-service orientation.
- Communicated clearly with business users and translated business requirements into technical specifications.
- Automated manual effort required by the production support team.
- Created a robust framework, using stored procedures and shell scripts, that orchestrates the whole ETL process.
- Interfaced daily with cross-functional team members on the EDW team and across the enterprise to resolve issues.
- Proficient in handling production incidents (critical issues) and production-related issues involving multiple stakeholders/vendors.
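The BTEQ loading bullet above refers to scripts along the following lines: a minimal sketch assuming a pipe-delimited input file, where the logon string, file path, and staging table are illustrative placeholders.

    .LOGON tdprod/etl_user,password;
    .IMPORT VARTEXT '|' FILE = /data/in/customer.dat;
    .QUIET ON
    .REPEAT *
    /* One INSERT per input record; fields are bound through the USING clause */
    USING (cust_id VARCHAR(18), cust_nm VARCHAR(100))
    INSERT INTO stg_db.customer_stg (cust_id, cust_nm)
    VALUES (:cust_id, :cust_nm);
    .IF ERRORCODE <> 0 THEN .QUIT 1;
    .LOGOFF;
    .QUIT 0;

The non-zero exit code lets the wrapping shell script or scheduler detect a failed load and hold downstream jobs.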
ETL/Teradata/Vertica Developer & Technical Lead
Environment: ETL Informatica, Teradata 14.0, Vertica, SQL, UNIX Shell Scripting, Autosys Jobs, Tableau
Responsibilities:
- Analyzed business requirements; designed, developed, and successfully implemented in production.
- Created mappings, workflows, and mapplets to pull data from multiple sources into the Teradata database using the Informatica PowerCenter Designer tool.
- Developed the 33 inbound core interfaces to fetch all iPerform data from Oracle into Teradata using the ETL tool.
- Created the outbound interface to send data to the external source team through the EAI interface.
- Wrote Teradata procedures to load incremental/aggregated data from the core to the semantic layer of Teradata (a sample procedure appears at the end of this section).
- Created the replication setup to replicate data from Teradata to Teradata, and from Teradata to Oracle and Vertica.
- Wrote Teradata utility scripts to load data from files and to extract and load data from one environment to another for data sync-up.
- Wrote a dynamic Oracle procedure to fetch data in a real-time API call.
- Wrote shell programs to execute workflows and Teradata procedures through command jobs.
- Created AutoSys JIL files to automate job execution.
- Created Tables, Views, Indexes and Procedures in Teradata.
- Compiled and ran the Teradata objects using BTEQ.
- Created Tables, Views, Synonyms, Triggers and Procedures in Oracle.
- Created the DLD (design-level document) and KT supporting documents.
- Created vsql scripts to load data into Vertica tables.
- Created tables, views, and projections in Vertica.
- Replicated data from Teradata to Vertica and tuned the SQL queries for better reporting performance.
- Prepared the test specifications and test logs.
- Provided Primary and Secondary support.
- Involved in UT, IT, UAT and Implementation in Production.
- Created and generated advisor performance reports using Tableau, as per business needs.
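A minimal sketch of the kind of core-to-semantic load procedure mentioned above; all object names (sem_db, core_db, acct_summary, acct_txn) are illustrative, not actual project objects.

    REPLACE PROCEDURE sem_db.load_acct_summary (IN p_load_dt DATE)
    BEGIN
        /* Re-run safe: clear the day's slice, then rebuild it from the core layer */
        DELETE FROM sem_db.acct_summary WHERE load_dt = :p_load_dt;

        INSERT INTO sem_db.acct_summary (acct_id, load_dt, txn_amt)
        SELECT acct_id, :p_load_dt, SUM(txn_amt)
        FROM core_db.acct_txn
        WHERE txn_dt = :p_load_dt
        GROUP BY acct_id;
    END;

Deleting the target slice before inserting makes the procedure idempotent, so a failed batch can simply be re-run for the same load date.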
System Analyst 1 & Techno-Functional Lead
Environment: Teradata, Vertica, SQL, UNIX Shell Scripting, Autosys Jobs
Responsibilities:
- Analyzed business requirements; designed, developed, and implemented in production.
- File system setup: migrated all UNIX scripts from the old environment to the new production system.
- Created new AutoSys jobs for all the databases (Teradata and Vertica).
- Created the new ETL/replication MD setup for all the semantic applications.
- Migrated all UDM and outbound interfaces to the new production system.
- Set up the databases and copied all data into the new environment.
- Created the new GDT setup to perform code deployment during implementation.
- Worked on Pre-Failover, Failover and Post-Failover activities.
- Validated all semantic table data for the new setup.
System Analyst 1 & Techno-Functional Lead
Environment: ETL Tool, Teradata 14, SQL, Unica, Autosys Jobs, Oracle, UNIX Shell Scripting, OSX
Responsibilities:
- Analyzed the requirements from the business and designed the application.
- Created both functional and technical design documents.
- Wrote Teradata procedures to load incremental/aggregated data from the core to the semantic layer of Teradata.
- Wrote Teradata utility scripts to load data from files and to extract and load data from one environment to another for data sync-up (an export sketch appears at the end of this section).
- Developed the outbound interfaces to generate feeds and send them to WWBI for reporting through EAI, using the ETL tool.
- Wrote AutoSys JIL files to automate job execution.
- Created Tables, Views, Indexes, and Procedures in Teradata.
- Compiled and ran the Teradata objects using BTEQ.
- Created Tables, Views, Synonyms, Triggers, and Procedures in Oracle.
- Created the DLD (design-level document) and KT supporting documents.
- Prepared the test specifications and test logs.
- Involved in UT, IT, UAT and production deployment.
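The data sync-up bullet above refers to export scripts along these lines: a minimal BTEQ sketch with illustrative logon, path, and table names. A matching .IMPORT script on the target system reloads the exported file.

    .LOGON tdprod/etl_user,password;
    .EXPORT DATA FILE = /data/out/acct_sync.dat
    /* Extract one day's slice for sync-up to the secondary environment */
    SELECT acct_id, acct_nm, open_dt
    FROM core_db.acct
    WHERE load_dt = CURRENT_DATE;
    .EXPORT RESET
    .LOGOFF;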
Teradata Developer & Techno-Functional Lead
Environment: Teradata, Shell Scripting, SQL, Autosys Jobs, UNIX, OSX
Responsibilities:
- Worked on business requirements gathering, design, development, and production implementation.
- Wrote Teradata utility scripts to export data from the primary system and load it into the secondary system
- Developed procedures to dynamically generate the insert and update scripts in the secondary system
- Developed Teradata macros to generate TD utility scripts automatically (a sample macro appears at the end of this section)
- Wrote AutoSys JIL files to automate job execution
- Created Tables, Views, Indexes, and Procedures in Teradata
- Compiled and ran the Teradata objects using BTEQ
- Created the DLD (Design level Doc) and KT Supporting docs
- Prepared the test specifications and test logs
- Provided Primary and Secondary support
- Involved in UT, IT, UAT and Implementation in Production
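A minimal sketch of a macro that generates utility statements from DBC metadata, as mentioned above; util_db and core_db are illustrative names.

    REPLACE MACRO util_db.gen_export_sql (p_db VARCHAR(128)) AS (
        -- Emit one export statement per base table; a wrapper script can
        -- assemble the output into a full FastExport/BTEQ script.
        SELECT 'SELECT * FROM ' || TRIM(DatabaseName) || '.' || TRIM(TableName) || ';'
               AS export_stmt
        FROM DBC.TablesV
        WHERE DatabaseName = :p_db
          AND TableKind = 'T'   -- 'T' = base tables only
    );

    EXEC util_db.gen_export_sql('core_db');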
Programmer Analyst
Environment: Teradata 13, Informatica Power Center, SQL, UNIX Shell Scripting, Autosys Jobs
Responsibilities:
- Analyzed business requirements; designed, developed, and implemented in production.
- Created mappings, workflows, and mapplets to pull data from multiple sources into the Teradata database using the Informatica Designer tool.
- Wrote Teradata procedures to load incremental/aggregated data from the core to the semantic layer of Teradata.
- Developed the replication setup to replicate data into secondary/tertiary environments.
- Wrote dynamic procedures to build SQL queries and fetch aggregated data for display on the dashboard (a sample procedure appears at the end of this section).
- Wrote shell programs to execute workflows and Teradata procedures through command jobs.
- Set up the AutoSys JIL files to automate job execution.
- Created Tables, Views, Indexes, and Procedures in Teradata.
- Compiled and ran the Teradata objects using BTEQ.
- Tuned Teradata procedures for better SLA compliance.
- Prepared the test specifications and test logs and provided Primary and Secondary support.
- Involved in UT, IT, UAT and Implementation in Production.
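A minimal sketch of the kind of dynamic procedure mentioned above, returning an aggregated result set to the dashboard; the object names (sem_db.dash_fact, region_cd) are illustrative, and in practice the metric name would be validated before being concatenated into the query.

    REPLACE PROCEDURE sem_db.get_dash_metric (IN p_metric VARCHAR(64), IN p_load_dt DATE)
    DYNAMIC RESULT SETS 1
    BEGIN
        DECLARE v_sql VARCHAR(2000);
        DECLARE c_out CURSOR WITH RETURN ONLY FOR s_dyn;

        /* Build the aggregation query at run time from the requested metric column */
        SET v_sql = 'SELECT region_cd, SUM(' || p_metric || ') AS metric_val ' ||
                    'FROM sem_db.dash_fact WHERE load_dt = DATE ''' ||
                    CAST(CAST(p_load_dt AS FORMAT 'YYYY-MM-DD') AS VARCHAR(10)) ||
                    ''' GROUP BY region_cd';

        PREPARE s_dyn FROM v_sql;
        OPEN c_out;   /* the open result set is returned to the caller */
    END;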