Teradata Developer Resume
Pleasanton, CA
SUMMARY
- Over 10 years of IT experience developing Enterprise Data Warehouse applications using Informatica, Oracle, and Teradata.
- Experience in all phases of data warehouse development: requirements, analysis, design, development, testing, and post-production support.
- Strong in-depth knowledge of data analysis, data quality, and source system analysis.
- Independent, self-starting, enthusiastic team player who adapts quickly to new technologies.
- Experience in Big Data technologies using Hadoop, Sqoop, Pig, and Hive.
- Excellent track record of delivering quality software on time to meet business priorities.
- Developed Data Warehouse/Data Mart systems using various RDBMS and mainframe sources (Oracle, MS SQL Server, Teradata, and DB2).
- Highly proficient with Informatica Power Center and Power Exchange, with exposure to Informatica Data Services.
TECHNICAL SKILLS
- ETL Tools: Informatica, DataStage, SSIS
- Databases: Teradata 12/13/14, Oracle 9i/10g/11g/12c, MySQL, SQL Server 2000/2005, MS Access, DB2, Hadoop (HDFS)
- GUI: .NET custom development, Business Objects, MicroStrategy
- Operating Systems: Windows, UNIX, Linux
- Languages: C#, VBScript, HTML, DHTML, JavaScript, SQL, PL/SQL, UNIX Shell, Python, Hive, Pig
- Web Related: ASP.NET, VBScript, HTML, DHTML, Java, JavaScript
- Tools & Utilities: Teradata Parallel Transporter, Aprimo 6.1/8.X, BTEQ, SQL Assistant, Toad, SQL Navigator, SQL*Loader, $U, HP Quality Center, PVCS, DataFlux, UC4, Control-M
- Domain Knowledge: Banking, Finance, Insurance, Healthcare, Energy
EDUCATION
- B.Tech. in Computer Science, JNTU, Hyderabad, India
PROFESSIONAL EXPERIENCE
Teradata Developer
Confidential, Pleasanton, CA
Responsibilities:
- Worked with Safeway and United business users to capture data requirements and transformation rules between source systems and the Albertsons EDW.
- Developed mapping and design documents for sales, promotional, and marketing data.
- Performed data profiling and source system analysis to understand data and quality issues.
- Developed BTEQ scripts to transform data from staging to third normal form (3NF) and then to aggregate tables (illustrative sketch below).
- Tuned complex Teradata queries to meet performance-level agreements using statistics, indexes, and partitioning techniques.
- Used MultiLoad, FastLoad, and BTEQ; created and modified databases; performed capacity planning.
- Generated flat files from Teradata 3NF tables using the Teradata FastExport utility.
Environment: Teradata 15.10, Teradata Viewpoint, Teradata Studio, UNIX.
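A minimal BTEQ sketch of the staging-to-3NF-to-aggregate pattern described above; database, table, and column names are hypothetical:

```sql
.LOGON tdprod/etl_user,password;

-- Move a day's sales from the staging layer into the 3NF layer (hypothetical names)
INSERT INTO edw_3nf.sales_txn (store_id, item_id, txn_dt, qty, ext_amt)
SELECT s.store_id, s.item_id, s.txn_dt, s.qty, s.ext_amt
FROM   stg.sales_txn_stg s
WHERE  s.load_dt = CURRENT_DATE;

-- Roll the 3NF detail up into a daily aggregate table
INSERT INTO edw_agg.sales_daily_agg (store_id, txn_dt, total_qty, total_amt)
SELECT store_id, txn_dt, SUM(qty), SUM(ext_amt)
FROM   edw_3nf.sales_txn
GROUP BY store_id, txn_dt;

.IF ERRORCODE <> 0 THEN .QUIT 8
.LOGOFF;
.QUIT 0;
```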
ETL Developer
Confidential, Sacramento, CA
Responsibilities:
- Performed source system analysis, provided input to data modeling, and developed ETL design documents per business requirements.
- Designed, developed, and tested the mappings, mapplets, worklets, and workflows involved in the ETL process.
- Developed and integrated data quality measures into the ETL framework using Informatica Data Quality (IDQ).
Environment: SQL Server, PL/SQL, MySQL, Informatica Power Center 9.1/9.6, Windows XP, UNIX Shell scripting.
Teradata Performance Engineer/ETL Developer
Confidential, CA
Responsibilities:
- Performed performance tuning, including collecting statistics, analyzing EXPLAIN plans, and determining which tables needed statistics; improved performance by 50-75% in some cases (illustrative sketch below).
- Developed complex SQL queries to identify performance bottlenecks in processing.
- Deep understanding of banking campaign management.
- Worked with business analysts to develop models and advanced SQL statements.
- Used MultiLoad and BTEQ; created and modified databases; performed capacity planning.
- Developed Sqoop jobs to integrate data from Oracle and Teradata for application migration.
- Transformed data from staging tables into final tables using Hive scripts.
- Developed Pig scripts for analysts to analyze data on HDFS.
- Used Hadoop to refine and analyze clickstream data.
- Delivered new and complex high-quality solutions to clients in response to varying business requirements; created and managed user accounts.
- Refreshed data using the FastExport and FastLoad utilities.
- Developed Informatica mappings for source-to-target loading from BODI to TP.
- Worked on Aprimo integration, customization, and configuration.
- Performed source system analysis, provided input to data modeling, and developed ETL design documents per business requirements.
Environment: Teradata 12/14.10, Teradata Viewpoint, Aprimo 6.1/8.X, Informatica Power Center 8.6/9.1, Pig, Hive, Oracle 10g/11g/12c, PL/SQL, Informatica 9.5, Windows, HP Quality Center, UNIX.
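An illustrative example of the skew-check, statistics, and plan-review work described above; all object names are hypothetical:

```sql
-- Check row distribution across AMPs to spot data skew (hypothetical table)
SELECT HASHAMP(HASHBUCKET(HASHROW(customer_id))) AS amp_no,
       COUNT(*)                                  AS row_cnt
FROM   edw.campaign_response
GROUP BY 1
ORDER BY row_cnt DESC;

-- Collect statistics on join and filter columns so the optimizer chooses better plans
COLLECT STATISTICS
  COLUMN (customer_id),
  COLUMN (campaign_id, response_dt)
ON edw.campaign_response;

-- Review the optimizer's plan and confidence levels once statistics are in place
EXPLAIN
SELECT c.campaign_id, COUNT(*)
FROM   edw.campaign_response c
JOIN   edw.customer_dim d
  ON   c.customer_id = d.customer_id
GROUP BY c.campaign_id;
```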
ETL and Teradata Developer
Confidential, CA
Responsibilities:
- Analyzed, designed, developed, tested, and deployed Informatica workflows, BTEQ scripts, and Python and shell scripts.
- Performed source system analysis, provided input to data modeling, and developed ETL design documents per business requirements.
- Designed, developed, and tested the mappings, mapplets, worklets, and workflows involved in the ETL process.
- Developed and integrated data quality measures into the ETL framework using Informatica Data Quality (IDQ).
- Performed data profiling using IDQ as input to ETL design and data modeling.
- Extensively used ETL to transfer data from different source systems and load it into the target database.
- Developed Informatica mappings covering all sources, targets, and transformations using Informatica Designer.
- Extracted data from various sources across the organization (Oracle, MySQL, SQL Server, and flat files) and loaded it into the staging area.
Environment: Teradata 13.10/14.0, Oracle 10g/11g, PL/SQL, MySQL, Informatica Power Center 8.6/9.1, Power Exchange, IDQ, OCL Tool, UC4, Control-M, ER Viewer, Business Intelligence, Windows, HP Quality Center, UNIX, Linux.
ETL Developer
Maryland State, Annapolis, MD
Responsibilities:
- Developed low-level mappings for tables and columns from source to target systems.
- Wrote and optimized initial data load scripts using Informatica and database utilities.
- Used partitions to extract data from the source and load it into Teradata using TPT Load, with proper load balancing on the Teradata server.
- Wrote complex BTEQ scripts to implement Confidential business functionality, transforming data from staging into third normal form.
- Participated in the Teradata upgrade project from TD12 to TD13.10 and conducted regression testing.
Environment: Teradata 12/13.10, Oracle 10g/11g, PL/SQL, MySQL, Informatica Power Center 8.6/9.1, SSIS, SSRS, ER Viewer, Windows XP, HP Quality Center, UNIX Shell scripting.
Senior ETL Developer, Owings Mills, MD
Responsibilities:
- Created Uprocs, sessions, and management units to schedule jobs using $U.
- Conducted source system analysis and developed ETL design documents to meet business requirements.
- Tuned Teradata SQL queries and resolved performance issues caused by data skew and spool space limits.
- Generated flat files from Teradata using FastExport and BTEQ to disseminate data to downstream dependent systems (illustrative sketch below).
Environment: Teradata V2R6/12, Oracle 9i/10g, PL/SQL, Informatica Power Center 8/8.6, $U, Business Objects, SSIS, Windows XP, UNIX Shell scripting.
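An illustrative FastExport script of the kind used to produce flat files for downstream systems; logon details, paths, and object names are hypothetical:

```sql
.LOGTABLE work_db.fexp_acct_log;
.LOGON tdprod/etl_user,password;

.BEGIN EXPORT SESSIONS 4;
.EXPORT OUTFILE /data/outbound/acct_extract.txt MODE RECORD FORMAT TEXT;

/* Pipe-delimited extract of active accounts for downstream consumers (hypothetical) */
SELECT TRIM(acct_id) || '|' ||
       TRIM(acct_type_cd) || '|' ||
       CAST(open_dt AS CHAR(10))
FROM   edw.account
WHERE  status_cd = 'A';

.END EXPORT;
.LOGOFF;
```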