ETL and Teradata Developer Resume
CA
PROFESSIONAL SUMMARY:
- Over 7 years of IT experience developing Enterprise Data Warehouse applications on Oracle and Teradata.
- Experience in all phases of data warehouse development: requirements analysis, design, development, testing, and post-production support.
- Strong in-depth knowledge of data analysis, data quality, and source system analysis.
- Independent, Self-starter, enthusiastic team player with strong adaptability to new technologies.
- Excellent track record in delivering quality software on time to meet the business priorities.
- Profound knowledge of data warehouse methodologies and data integration technologies.
- Ability to understand and implement both technology and business concepts & processes.
- Ability to work independently and in a team environment.
COMPUTER EXPERTISE:
ETL Tools: Informatica 6.x/7.x/8.x/9.1, DataStage
Databases: Teradata 12/13.10, Oracle 11g/10g, SQL Server 2000, MS Access
GUI: .NET custom development, Business Objects, MicroStrategy
Operating Systems: Windows XP/7, UNIX, Linux
Languages: C#, VBScript, HTML, DHTML, JavaScript, SQL, PL/SQL, UNIX shell scripting
Web Related: ASP.NET, VBScript, HTML, DHTML, Java, JavaScript
Tools & Utilities: Teradata Parallel Transporter, BTEQ, SQL Assistant, Toad, SQL Navigator, SQL*Loader, $U, HP Quality Center, PVCS, Kintana, DataFlux, UC4
Domain Knowledge: Finance, Insurance, Healthcare, Energy
PROFESSIONAL EXPERIENCE:
ETL and Teradata Developer
Confidential, CA
Responsibilities:
- Performed analysis, design, development, testing, and deployment of Informatica workflows, BTEQ scripts, and shell scripts.
- Conducted source system analysis, provided input to data modeling, and developed ETL design documents per business requirements.
- Developed BTEQ scripts to transform data from staging to third normal form (3NF) and then to aggregate tables.
- Tuned complex Teradata queries to meet performance SLAs using statistics, indexes, and partitioning techniques.
- Developed Informatica mappings covering all sources, targets, and transformations using Informatica Designer.
- Developed mappings with transformations such as Expression, Normalizer, Union, Filter, Router, Joiner, and Lookup to cleanse data and migrate clean, consistent records.
- Extracted data from sources across the organization (Oracle, SQL Server, and flat files) and loaded it into the staging area.
- Developed Informatica mappings and workflows to extract data and load it into the Teradata staging area using the FastLoad and TPump utilities.
- Developed Informatica mappings for source-to-target loading using TPT connectors (Load, Update, Export, and Stream operators).
- Used partitioning to extract data from sources and load it into Teradata via TPT Load, balancing the load on the Teradata server.
- Generated flat files from Teradata 3NF tables using the FastExport utility, then FTPed them via shell script to a separate UNIX server for consumption by the application team.
- Created job events and job plans and scheduled jobs using UC4.
- Worked closely with the QA, Release Engineering, and Operations teams, providing timely responses to bugs and issues.
- Wrote production support documents, test cases, and transition documents.
- Working knowledge of MicroStrategy BI tools, used to validate metrics.
- Experience with Agile/Scrum development methodologies.
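A minimal BTEQ sketch of the stage-to-3NF load pattern described above (connection details and all table/column names are hypothetical, for illustration only):

```sql
.LOGON tdprod/etl_user,password;    -- hypothetical logon

/* Move cleansed rows from a staging table into a 3NF target;
   object names are illustrative, not from an actual project. */
INSERT INTO edw_3nf.customer (customer_id, customer_name, load_dt)
SELECT src_cust_id,
       TRIM(src_cust_name),
       CURRENT_DATE
FROM   edw_stg.customer_stg
WHERE  src_cust_id IS NOT NULL;

/* Abort with a nonzero return code on failure so the scheduler
   (e.g. UC4) can detect the failed step and alert. */
.IF ERRORCODE <> 0 THEN .QUIT 8;

.LOGOFF;
.QUIT 0;
```

The nonzero `.QUIT` code is what lets an external scheduler distinguish a failed load step from a clean run.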
Environment: Teradata 13.10/14.0, Oracle 11g/10g, Informatica PowerCenter 8.6/9.0, PowerExchange, OCL Tool, UC4, Control-M, ER Viewer, Business Objects, Windows XP, UNIX, Linux.
ETL Developer
Confidential, Annapolis, MD
Responsibilities:
- Documented data and report requirements from Business users and other stakeholders.
- Worked on Source systems to analyze data and provide input to Physical Database design.
- Developed low-level source-to-target mappings for tables and columns.
- Wrote and optimized initial data load scripts using Informatica and database utilities.
- Developed Informatica mappings for source-to-target loading using TPT connectors (Load, Update, Export, and Stream operators).
- Used partitioning to extract data from sources and load it into Teradata via TPT Load, balancing the load on the Teradata server.
- Wrote complex BTEQ scripts to incorporate business logic while transforming data from staging into third normal form (3NF).
- Developed aggregate scripts, both incremental and complete refresh, using BTEQ.
- Worked with the Teradata DBA team to tune BTEQ scripts for loading and transforming data.
- Used collected statistics, secondary indexes, join indexes, and materialization as needed to meet performance SLAs.
- Participated in code reviews for performance optimization and standards adherence.
- Developed extract jobs using Informatica to feed downstream applications.
- Developed and executed unit test cases on all ETL and BTEQ scripts.
- Participated in the Teradata upgrade project from TD12 to TD13.10, conducting regression testing.
- Troubleshot and fixed defects.
- Transferred knowledge to Operations and support teams.
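The statistics/index tuning mentioned above typically involves statements like the following sketch (all object and column names are hypothetical):

```sql
/* Refresh optimizer statistics on frequent join/filter columns
   so the Teradata optimizer picks better plans. */
COLLECT STATISTICS ON edw_3nf.orders COLUMN (customer_id);
COLLECT STATISTICS ON edw_3nf.orders COLUMN (order_dt);

/* Secondary index to support frequent lookups by order date. */
CREATE INDEX (order_dt) ON edw_3nf.orders;

/* Join index materializing a common orders/customer join,
   avoiding repeated redistribution at query time. */
CREATE JOIN INDEX edw_3nf.orders_cust_ji AS
SELECT o.order_id, o.order_dt, c.customer_name
FROM   edw_3nf.orders o
JOIN   edw_3nf.customer c
ON     o.customer_id = c.customer_id;
```

Statistics and join indexes trade load-time and storage cost for query-time performance, which is why they are applied selectively against SLA targets rather than everywhere.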
Environment: Teradata 12/13.10, Oracle 11g/10g, Informatica PowerCenter 8.6/9.1, ER Viewer, Windows XP, UNIX, Linux.
Confidential, Owings Mills, MD
Senior ETL Developer
Responsibilities:
- Conducted source system analysis and developed ETL design documents to meet business requirements.
- Developed Informatica mappings and workflows to extract data from PeopleSoft, Oracle, and CSV files and load it into the Teradata staging area using the FastLoad and TPump utilities.
- Developed ETL processes to load data from source to stage, stage to work tables, and work to 3NF using Informatica pushdown optimization to leverage the database's processing power.
- Designed and developed custom data quality audits to identify and report data mismatches between source and target systems and alert the Operations team.
- Tuned Teradata SQL queries and resolved performance problems caused by data skew and spool space issues.
- Created Uprocs, sessions, and management units to schedule jobs using $U.
- Generated flat files from Teradata using FastExport and BTEQ for dissemination to downstream dependent systems.
- Coordinated with offshore project team members on a daily basis to keep tasks moving and resolve issues.
- Supported system integration and user acceptance testing to obtain sign-off.
- Provided post-go-live production support and knowledge transfer to the production support team.
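A data quality audit of the kind described above can be as simple as a source-vs-target row count comparison; the sketch below uses hypothetical object names:

```sql
/* Compare row counts between a staging source and its 3NF target;
   table names are illustrative only. */
SELECT 'customer' AS tbl_nm, 'source' AS side, COUNT(*) AS row_cnt
FROM   edw_stg.customer_stg
UNION ALL
SELECT 'customer', 'target', COUNT(*)
FROM   edw_3nf.customer;

/* Mismatched counts would be written to an audit table and
   surfaced to the Operations team via the scheduler's alerting. */
```

More thorough audits extend this pattern to checksums or column-level comparisons on key attributes.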
Environment: Teradata V2R6/12, Oracle 9i/10g, Informatica PowerCenter 8/8.6, $U, Business Objects, Windows XP, UNIX, Linux
Confidential, Temple, TX
ETL Developer
Responsibilities:
- Documented functional specifications and other artifacts used in developing ETL mappings.
- Worked with business analysts on requirements gathering, business analysis, testing, and project coordination.
- Documented user requirements, translated them into system solutions, and developed implementation plans.
- Developed core project components, including XML processing and XSD validation, and created well-defined views in the pre-staging area and loaded them.
- Developed numerous complex mappings, mapplets, and reusable transformations using Informatica Designer to facilitate daily and monthly data loads.
- Optimized Performance of existing Informatica workflows.
- Scheduled Informatica Workflows using workflow manager.
Environment: Oracle 9i, SQL Server 2000, DB2, Informatica PowerCenter 7.1, Erwin, Cognos, XML, Windows NT/2000, UNIX
Confidential, Minnesota, MN
ETL Developer
Responsibilities:
- Developed various mappings covering all sources, targets, and transformations using Informatica Designer.
- Developed mappings using transformations such as Expression, Filter, Joiner, and Lookup to cleanse data and migrate clean, consistent records.
- Extracted data from sources across the organization (Oracle, SQL Server, and flat files) and loaded it into the staging area.
- Created and scheduled sessions and batch processes to run on demand, on schedule, or only once using Informatica Workflow Manager, and monitored data loads with the Workflow Monitor.
- Participated in testing and performance tuning by identifying and resolving bottlenecks in mapping logic, setting cache values, and creating partitions for parallel data processing.
- Developed reports in Cognos Impromptu and PowerPlay.
Environment: Oracle 9i, SQL Server 2000, PL/SQL, Informatica PowerCenter 7.1, Erwin, Cognos, Windows NT/2000, UNIX