Teradata Senior Developer Resume
Atlanta, GA
SUMMARY:
- 14.5 years of experience in end-to-end IT solution delivery across all phases of the development lifecycle, including requirements gathering, analysis, design, development, implementation, acceptance testing, and performance tuning. Extensively used ETL methodology to support data extraction, transformation, and loading in corporate-wide ETL solutions. Hands-on experience in data modeling.
- 13+ years of IT experience in business analysis, design, implementation, and testing of applications using a wide range of technologies, including data warehousing and databases, in the banking industry.
- Strong hands-on experience using Teradata utilities (Teradata Parallel Transporter (TPT), MultiLoad, TPump, BTEQ, FastLoad).
- Strong in performance analysis and SQL query tuning using EXPLAIN plans and COLLECT STATISTICS.
- Excellent experience with different index types (Primary Index, Secondary Index, Join Index, Partitioned Primary Index).
- Good experience in tuning complex SQL queries to reduce hardware resource consumption, and rich experience in writing complex SQL queries to load fact tables for analyzing measures.
- 11+ years of IT experience using Informatica Designer tools such as Source Analyzer, Warehouse Designer, Transformation Developer, Mapping Designer, and Mapplet Designer.
- Excellent knowledge of identifying performance bottlenecks and tuning Informatica loads for better performance and efficiency.
- Designed and developed various mappings using transformation logic such as Unconnected and Connected Lookup, Router, Filter, Expression, Aggregator, Joiner, and SQL transformations, and handled complex XML sources.
- 1+ years of IT experience rewriting Teradata queries for Google BigQuery, with strong knowledge of both relational and columnar databases.
- Good experience with date-partitioned tables, denormalized tables (using ARRAY and STRUCT), and clustered tables in BigQuery.
- Newcomer to machine learning and deep learning; highly passionate and continuously learning through online courses. Please find my personal project on Amazon Fine Food Reviews in the following GitHub.
- 13+ years of IT experience writing complex UNIX scripts; excellent at solving complex problems in UNIX.
- Good experience scheduling ETL jobs using AutoSys and WLM.
- Good knowledge of building logical and physical models for data warehouses; designed dimensional models for data mart creation, allowing flexible, high-performance reporting of core data. Strong understanding of data warehousing principles, including fact tables, dimension tables, data mapping, and star and snowflake schema modeling.
- Strong experience in quality assurance, including system integration testing, process scenario testing, and user acceptance testing; query optimization and performance tuning using SQL; QA and testing using SQL scripts; and performance tuning of jobs migrated from development through QA to production environments.
TECHNICAL SKILLS:
ETL Tools: Informatica 8.x/9.x, Informatica Data Validation Option
Language: Python
Database Systems: Google BigQuery, Teradata 12/13, Oracle 10g, SQL Server
Operating Systems: UNIX, Windows 98/NT/2000/XP
Other Tools: Sybase PowerDesigner 16, MS Visio, TOAD, SAS
PROFESSIONAL EXPERIENCE:
Confidential, Atlanta, GA
Teradata Senior Developer
Responsibilities:
- Created a dataset from the Sales, Markdown, and Inventory tables in BigQuery.
- Calculated the markdown rate for both RTL and control SKUs.
- Calculated the benefit %.
- Partitioned the tables to avoid full table scans (see the sketch below).
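A minimal sketch of the kind of date-partitioned table used here; dataset, table, and column names are hypothetical.

```sql
-- Hypothetical names; a date-partitioned BigQuery table so that queries filtering
-- on sale_date scan only the relevant partitions instead of the whole table.
CREATE TABLE retail.sales_daily
PARTITION BY sale_date AS
SELECT sku, store_id, sale_date, sales_amt, markdown_amt
FROM retail.sales_raw;

-- The date filter prunes partitions, so only one month of data is scanned.
SELECT sku,
       SAFE_DIVIDE(SUM(markdown_amt), SUM(sales_amt)) AS markdown_rate
FROM retail.sales_daily
WHERE sale_date BETWEEN DATE '2019-02-01' AND DATE '2019-02-28'
GROUP BY sku;
```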
Environment: Google BigQuery.
Confidential
Developer
Responsibilities:
- Converted many manual SAS steps into an automated tool using Teradata and UNIX.
- Partitioned all date-related tables so that the optimizer uses partition elimination and avoids full table scans.
- Created Primary Indexes on join keys and on filter constraints used in the WHERE clause.
- Analyzed queries using the EXPLAIN plan to understand the optimizer's estimates (high, low, or no confidence), and improved them to high confidence by collecting statistics where necessary, since it is a costly operation.
- Used a UNIX script to derive dates and feed them into the WHERE clause date filter to achieve partition elimination.
- Created volatile tables to save memory space.
- Created date-partitioned tables so that only the required date ranges are accessed.
- Created denormalized tables using nested (STRUCT) and repeated (ARRAY) data structures.
- Used clustering on date-partitioned tables, on the columns used in WHERE clause filters (column order matters when clustering on multiple columns).
- Used Datalab (IPython notebooks) for visualization and user self-service requests.
- Created a dataset in BigQuery from Sales and Auto Markdown data, excluding Manual Markdown.
- Calculated the markdown rate for each fiscal week.
- Calculated the aggregated markdown rate at the fiscal month level and compared whether it exceeded the 40% company average.
- Used clustered tables when aggregating and filtering the data (see the sketch below).
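A minimal sketch of the partitioned, clustered, denormalized table described above; all names are hypothetical.

```sql
-- Hypothetical names; date-partitioned and clustered, with nested (STRUCT) and
-- repeated (ARRAY) fields so downstream queries avoid extra joins and shuffling.
CREATE TABLE retail.markdown_weekly
PARTITION BY fiscal_week_start
CLUSTER BY dept_id, sku AS             -- cluster column order matters for filter pruning
SELECT
  fiscal_week_start,
  dept_id,
  sku,
  STRUCT(reg_price AS regular, md_price AS markdown) AS price,   -- nested record
  ARRAY_AGG(STRUCT(store_id, units_sold)) AS store_sales         -- repeated record
FROM retail.auto_markdown_detail
GROUP BY fiscal_week_start, dept_id, sku, reg_price, md_price;
```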
Environment: Google BigQuery.
Confidential
Developer
Responsibilities:
- Calculated the following metrics (see the sketch after this list):
- % SKU-stores in clearance
- SKU-store stockout %
- Sell-through %
- Markdown rate %
- Inventory post-clearance
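A minimal sketch of how two of these metrics might be derived; the table, column names, and simplified metric definitions are assumptions, not the exact production logic.

```sql
-- Hypothetical names; simplified definitions:
--   sell-through %  = units sold / (units sold + ending on-hand units)
--   markdown rate % = markdown dollars / sales dollars
SELECT
  sku,
  store_id,
  SAFE_DIVIDE(SUM(units_sold),
              SUM(units_sold) + SUM(units_on_hand)) * 100 AS sell_through_pct,
  SAFE_DIVIDE(SUM(markdown_amt), SUM(sales_amt)) * 100 AS markdown_rate_pct
FROM retail.clearance_detail
GROUP BY sku, store_id;
```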
Environment: Google BigQuery.
Confidential
Developer
Responsibilities:
- Created R12 and R3 sales for the current and previous fiscal years.
- Calculated CPI for Lowe's and Menards for the current and previous fiscal years.
- Used DENSE_RANK to get the top 40 classes for each department and also at the market level, as sketched after this list.
- Used nested data (STRUCT) and repeated data (ARRAY) to avoid joins and shuffling, respectively.
- Created R12 sales for the current and previous fiscal years.
- Classified classes into 4 tickets based on sales.
- Created a report for the top and bottom 10 class drivers.
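A minimal sketch of the DENSE_RANK logic for the top 40 classes per department; names are hypothetical.

```sql
-- Hypothetical names; rank classes within each department by R12 sales and keep the top 40.
SELECT dept_id, class_id, r12_sales
FROM (
  SELECT
    dept_id,
    class_id,
    r12_sales,
    DENSE_RANK() OVER (PARTITION BY dept_id ORDER BY r12_sales DESC) AS class_rank
  FROM retail.class_r12_sales
) ranked
WHERE class_rank <= 40;
```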
Environment: Google BigQuery.
Confidential
Responsibilities:
- Identified the Key Item Classes (KIC) that drive customers to the stores.
- Identified all transactions in the current year with a Key Item Class as the dominant class, along with all SKUs purchased alongside the KIC.
- Used 2 metrics to eliminate project-irrelevant SKUs.
- This helps in understanding customer project-shopping behavior.
- Used window functions such as LEAD and LAST_VALUE to group transactions into 14-day window periods (see the sketch below).
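A minimal sketch of grouping a customer's transactions into 14-day windows with window functions. This version uses LAG plus a running sum as one plausible approach (the original solution used LEAD and LAST_VALUE); all names are hypothetical.

```sql
-- Hypothetical names; start a new window whenever the gap from the previous
-- transaction exceeds 14 days, then number the windows with a running sum.
WITH ordered_txns AS (
  SELECT
    customer_id,
    txn_id,
    txn_date,
    LAG(txn_date) OVER (PARTITION BY customer_id ORDER BY txn_date) AS prev_txn_date
  FROM retail.kic_transactions
)
SELECT
  customer_id,
  txn_id,
  txn_date,
  SUM(CASE
        WHEN prev_txn_date IS NULL
          OR DATE_DIFF(txn_date, prev_txn_date, DAY) > 14 THEN 1
        ELSE 0
      END) OVER (PARTITION BY customer_id ORDER BY txn_date) AS window_id
FROM ordered_txns;
```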
Environment: Google BigQuery.
Confidential
Developer
Responsibilities:
- Analysis, development, testing & support on Teradata.
- Used BTEQ to load data into Teradata tables (a sketch of the kind of SQL submitted through BTEQ follows).
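A minimal sketch of the kind of SQL a BTEQ script would submit for such a load; database, table, and column names are hypothetical.

```sql
-- Hypothetical names; insert only staged rows not already present in the target table.
INSERT INTO edw.customer_dim (customer_id, customer_name, load_dt)
SELECT stg.customer_id,
       stg.customer_name,
       CURRENT_DATE
FROM   stg.customer_stg AS stg
WHERE  NOT EXISTS (
         SELECT 1
         FROM   edw.customer_dim AS tgt
         WHERE  tgt.customer_id = stg.customer_id
       );
```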
Environment: Teradata 14, UNIX
Confidential, Atlanta, GA
Teradata Senior Developer
Responsibilities:
- Created extracts from the Invoice and Product detail tables at outlet and branch levels to provide a flat file for loading into Oracle; TPT was used for file creation.
- Front-end users maintain pricing-related info for materials at the branch level, which is stored in Oracle tables.
- The file generated from Oracle is loaded back into Teradata for reporting purposes.
- Analysis, development, testing & support on Teradata.
- Used TPT to load the flat file into a Teradata temporary table.
- Worked on extracts and data manipulation from the Teradata EDW through DS jobs and UNIX Tivoli jobs.
Environment: Teradata 15, Tivoli and UNIX
Confidential, Richmond, VA
Senior Technical Lead/ Teradata Developer
Responsibilities:
- Analysis of business requirements for end-to-end ETL process.
- Worked closely with business analysts to understand the business needs for decision support data.
- Extraction, transformation, and loading were performed using Informatica PowerCenter. Developed various sessions and batches for all mappings to load from source tables to flat file extracts.
- Wrote shell scripts for data validation and reconciliation.
- Extensively worked on the performance tuning of mappings and sessions.
- Designed and Developed pre-session, post-session routines and batch execution routines using Informatica Server to run Informatica sessions.
- Developed Informatica jobs for Conversions which involved loading the flat file data into staging tables.
- Troubleshot the ETL process developed for conversions and implemented various techniques to enhance performance.
- Extensively involved in creating design documents for loading data into the data warehouse, and worked with the data modeler to change/update the data warehouse model when needed.
- Extensively used the Warehouse Designer in Informatica PowerCenter to create different target definitions.
- Created robust, complex Teradata scripts and troubleshot data load problems.
- Worked as Teradata & Informatica technical lead.
- Conducted development reviews and clarified doubts with the team on a regular basis.
- Understood the existing system and implemented the differences.
- Implemented performance tuning techniques to overcome spool space issues (see the sketch after this list).
- Worked on scheduling the Informatica jobs using WLM.
- Developed a complex file validation script in UNIX.
- Implemented performance tuning techniques to reduce execution time in Informatica workflows.
- Used Teradata utilities such as MultiLoad and Teradata Parallel Transporter (TPT) to load data into the Teradata data warehouse.
- Implemented partitioning in Informatica, which improved performance significantly.
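A minimal sketch of the statistics collection and EXPLAIN check typically used when chasing a spool space issue; all names are hypothetical.

```sql
-- Hypothetical names; collect statistics on the join and filter columns the optimizer
-- estimated with low/no confidence, then re-check the EXPLAIN output.
COLLECT STATISTICS COLUMN (acct_id) ON edw.txn_fact;
COLLECT STATISTICS COLUMN (txn_dt) ON edw.txn_fact;

EXPLAIN
SELECT f.acct_id,
       SUM(f.txn_amt) AS txn_amt
FROM   edw.txn_fact AS f
JOIN   edw.acct_dim AS a
  ON   f.acct_id = a.acct_id
WHERE  f.txn_dt BETWEEN DATE '2014-01-01' AND DATE '2014-03-31'
GROUP BY f.acct_id;
```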
Confidential, Charlotte, NC
Senior Technical Lead/ Teradata Developer/Analyst/ETL Developer
Responsibilities:
- Researched Sources and identified necessary Business Components for Analysis.
- Gathered the required information from the users.
- Interacted with different system groups for analysis of systems.
- Created tables and views in Teradata according to the requirements.
- Created proper Primary Indexes, taking into consideration both the planned access of data and even distribution of data across all available AMPs.
- Used Teradata utilities such as MultiLoad, TPump, and FastLoad to load data into the Teradata data warehouse from complex XML and flat files.
- Wrote appropriate conversion code per the business logic using BTEQ scripts.
- Performance-tuned user queries and frequently used SQL operations to improve performance.
- Designed the ETLs and conducted review meetings.
- Created BTEQ scripts to extract data from warehouse for downstream applications.
- Scheduled jobs for batch processing using Autosys Scheduler.
- Analyzed production support documents and found feasible times to run jobs.
- Performance tuning of sources, targets, mappings, and SQL queries in transformations.
- Developed unit test plans and involved in system testing.
- Led a team of offshore developers from onshore.
- Involved in meetings with SMEs (Subject Matter Experts) to understand the business requirements.
- Involved in impact assessment of CRs.
- Converted functional specifications to ETL code.
- Wrote UNIX scripts for the ETL process to provide a flexible design for scheduling the Informatica jobs in AutoSys.
- Interacted with the client and business analysts to gather functional requirements.
- Performed data validation testing using both source and target systems.
- Wrote and executed SQL queries for data validation testing of facts and dimensions.
- Involved in the requirement study, design and development.
- Extensive experience in UNIX shell scripting, FTP and file management in various UNIX environments.
- Identified common business functionalities among modules and implemented the mapplet (code reusability) feature in Informatica.
- Converted functional specifications to ETL code and technical specifications.
- Created reusable code for use across different feeds; tuned the ETL process to maximize data flow and minimize time and hardware resources.
- Took care of setting up the Development, Test, UAT, and Production environments.
- Wrote UNIX scripts for the ETL process to keep the design flexible for the end-to-end solution to requirements raised by customers.
- Mentored junior team members and provided training on technical environments and business processes.
- Organized knowledge transfer sessions to encourage teamwork and create more subject matter experts.
- Involved in CM activities, milestones, metrics, onsite coordination, and interaction with the customer and business user community from onsite.
- Responsible for reviewing the developed programs.
- Implemented SCD Type 2 using a combination of Teradata and Informatica, making the process easy to debug (see the sketch below).
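A minimal sketch of the SCD Type 2 pattern on the Teradata side; table, column names, and the tracked attribute are hypothetical, with the Informatica workflow orchestrating these steps.

```sql
-- Hypothetical names; step 1: expire the current row when a tracked attribute changes.
UPDATE tgt
FROM edw.customer_dim AS tgt, stg.customer_stg AS src
SET eff_end_dt   = CURRENT_DATE - 1,
    current_flag = 'N'
WHERE tgt.customer_id  = src.customer_id
  AND tgt.current_flag = 'Y'
  AND tgt.address     <> src.address;

-- Step 2: insert a new current version for new customers and for customers
-- whose current row was just expired.
INSERT INTO edw.customer_dim
  (customer_id, address, eff_start_dt, eff_end_dt, current_flag)
SELECT src.customer_id,
       src.address,
       CURRENT_DATE,
       DATE '9999-12-31',
       'Y'
FROM   stg.customer_stg AS src
LEFT JOIN edw.customer_dim AS tgt
  ON  tgt.customer_id  = src.customer_id
  AND tgt.current_flag = 'Y'
WHERE  tgt.customer_id IS NULL;
```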
Environment: Informatica 7.x/8.x, Informatica PowerExchange, Oracle 10g, and UNIX
Confidential
Developer
Responsibilities:
- Used Source Analyzer and Warehouse Designer to import the source and target database schemas, and Mapping Designer to map sources to targets.
- Used Transformation Developer to develop the Joiner, Filter, Lookup, Expression, and Aggregator transformations used in mappings.
- Created mappings using transformations such as Source Qualifier, Filter, Rank, Sequence Generator, Update Strategy, and Lookup.
- Developed Joiner transformations to extract data from multiple tables.
- Involved in data extraction from Oracle and flat files using Informatica.
- Developed reusable Mapplets using Mapplet Designer.
- Created and executed sessions using Workflow Manager.
- Incorporated new changes into developed mappings.
Environment: Informatica 8.6.1, Oracle 10g and UNIX