Teradata/Informatica Developer Resume
IL
SUMMARY:
- Around 5 years of experience in ETL (Extract, Transform, Load) projects using Data Warehousing tools like Informatica and databases like Oracle, MySQL, SQL Server, Teradata, and DB2 UDB
- Assisted Business Analysts with analyzing requirements and with the design and development of various ETL components for various applications
- Developed comprehensive models for managing and communicating the relationships between Base Objects, and performed the ETL process for loading the data into the hub
- Able to interact and communicate effectively with team members, support them, and provide suggestions for development activities
- Comfortable with technical and functional applications of Data Mapping, Data Management, Data Transportation, Data Staging, and the design and development of Relational Database Systems (RDBMS)
- Extensive experience with Star Schema and Snowflake Schema, Fact and Dimension tables, and Slowly Changing Dimensions; interacted with clients and users to understand their requirements
- Expertise in creating transformations and mappings using Informatica Designer, and in Business Model development with Aggregation Rules, Cache Management, Dimensions, Hierarchies, Measures, Partitioning, and Time Series
- Familiar with creating Secondary Indexes and Join Indexes; finalized business requirements with business stakeholders and users
- Good knowledge of GitHub for maintaining code versions; programming experience in BTEQ, Macros, and Triggers, and expertise in the Teradata RDBMS using the FastLoad, MultiLoad, TPump, FastExport, Teradata SQL Assistant, and BTEQ utilities
- Involved in implementing and translating business requirements into high-level and low-level designs for ETL processes in Teradata; experience in issue tracking and agile project management using JIRA
- Implemented Type 1 and Type 2 Slowly Changing Dimension methodologies when loading ODS tables to keep historical data in the data warehouse (see the sketch after this list)
- Knowledge of Kimball and Inmon methodologies and of implementing complex business rules by creating robust mappings and Reusable Transformations such as Aggregator, Connected Lookup, Unconnected Lookup, Expression, Filter, Joiner, Router, and Update Strategy
- Sound knowledge of all phases of the Software Development Life Cycle (SDLC) and of Agile methods
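Below is a minimal sketch of the Type 2 approach referenced above, written as a shell-wrapped BTEQ script: the current version of a changed row is end-dated, then a fresh current version is inserted. All object and column names (dim_customer, stg_customer, cust_id, cust_segment) and the logon placeholders are hypothetical, not objects from the actual projects.

```sh
#!/bin/ksh
# Hedged sketch of a Type 2 SCD load in BTEQ; tdprod/etl_user and the
# TD_PWD environment variable are placeholders for real credentials.
bteq <<EOF
.LOGON tdprod/etl_user,${TD_PWD}

/* Close out the current version of any row whose tracked attribute changed */
UPDATE dim_customer
SET    end_dt = CURRENT_DATE - 1,
       current_flag = 'N'
WHERE  current_flag = 'Y'
AND    EXISTS (SELECT 1
               FROM   stg_customer s
               WHERE  s.cust_id = dim_customer.cust_id
               AND    s.cust_segment <> dim_customer.cust_segment);
.IF ERRORCODE <> 0 THEN .QUIT 8

/* Insert a new current version for new keys and just-closed changed keys */
INSERT INTO dim_customer (cust_id, cust_segment, start_dt, end_dt, current_flag)
SELECT s.cust_id, s.cust_segment, CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM   stg_customer s
WHERE  NOT EXISTS (SELECT 1
                   FROM   dim_customer d
                   WHERE  d.cust_id = s.cust_id
                   AND    d.current_flag = 'Y');
.IF ERRORCODE <> 0 THEN .QUIT 8

.LOGOFF
.QUIT 0
EOF
```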
TECHNICAL SKILLS:
Informatica (9.x) applications: Informatica (Power Center Tools and Metadata Reporter)
Data Migration and Data Quality: Teradata (MLoad, FastLoad, FastExport, TPump, TPT, BTEQ)
Data Warehousing and ETL Tools: Teradata Manager, DBQL, TDQL, DBW, Priority Scheduler
Databases: Teradata V2R5/V2R6.x/12.0/13.x/14.10, TTU 12, SQL Server, MySQL, and Oracle
DB Modeling/Case tool: ER Studio, Erwin 4.0/3.5, MS Visio Professional, Star/Snowflake
ETL Tools: Informatica Power Center 9.x/8.x/7.x, TWS, GitHub
Programming Skills: C++, Java, SQL, PL/SQL, Teradata SQL, Oracle SQL, UNIX Shell Scripting
Operating Systems: AIX, Sun Solaris, Windows 95/98/2000/XP, UNIX, LINUX
Reporting Tools: Business Objects XI, Crystal Reports 10, Oracle Reports 2.5, MS Office Suite
Teradata Utilities: Analyst Pack, BTEQ, SQL Assistant, Database Window, FastLoad, MultiLoad, FastExport, TPT
Teradata DBA Utilities: Viewpoint, Query Monitor
Scheduling tool: Autosys, Control-M Scheduler
WEB Technologies: Ajax, Bootstrap, CSS, DOJO, JavaScript, jQuery, HTML, XHTML, Ember.js, Node.js, REST, Swing
PROFESSIONAL EXPERIENCE:
Confidential, IL
Teradata/Informatica Developer
Responsibilities:
- Analyzed business requirements and designs; created and maintained source-to-target mapping documents for the ETL development team, and wrote technical specifications to design and redesign solutions.
- Built and deployed applications using Cloud Foundry and integrated them with development tools
- Created shell scripts to fine-tune the ETL flow of the Informatica workflows, and BTEQ scripts for processing the Acxiom InfoBase data into CDI data warehouse tables.
- Developed Teradata Macros and Stored Procedures to load data into incremental/staging tables, move it from staging to journal tables, and then from journal into base tables (see the BTEQ sketch after this list)
- Extracted data from various source systems such as Oracle, SQL Server, and flat files per the requirements, using TPT for scalable, high-speed, parallel data extraction, loading, and updating.
- Followed the Agile methodology for the SDLC; involved in Sprint Planning, Scrum, and Retrospective meetings.
- Gathered requirements and created functional and technical design documents covering the study, business rules, data mappings, and workflows.
- Developed mappings in Ab Initio to load data from various sources using Ab Initio components such as Partition by Key, Partition by Round Robin, Reformat, Rollup, Join, Scan, Normalize, Gather, and Merge.
- Involved in migrating online data from SQL Server to the Hadoop environment using Sqoop (a Sqoop sketch also follows this list).
- Involved in developing user interface applications and professional web applications using Ajax, HTML5, XHTML, CSS3, JavaScript, jQuery, JSON, XML, Node.js, Bootstrap, and AngularJS; implemented functionality such as searching, filtering, sorting, and validation using AngularJS and JavaScript.
- Knowledge of Views, Teradata Triggers, Macros, and User-Defined Functions.
- Used Java 8 features such as Lambda Expressions, along with AWS services including Cognito, DynamoDB, Redshift, and AWS Lambda.
- Migrated repository objects, services, and scripts from the development environment to the production environment; experienced in troubleshooting and resolving migration and production issues.
- Implemented the data access layer (DAO classes) using Hibernate with Spring as the ORM framework, and configured the XML files required by Hibernate and Spring. Worked effectively in an onsite/offshore model.
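The following is a minimal sketch of the staging-to-journal-to-base movement described above, as a shell-driven BTEQ session. The database, table, and column names (stg_db.acct_stg, jrnl_db.acct_jrnl, base_db.acct, acct_id) and the logon values are illustrative placeholders, not the actual project objects.

```sh
#!/bin/ksh
# Hedged sketch: move data staging -> journal -> base in one BTEQ session.
bteq <<EOF
.LOGON tdprod/etl_user,${TD_PWD}

/* Stage to journal: keep an audit copy stamped with the load time */
INSERT INTO jrnl_db.acct_jrnl
SELECT s.*, CURRENT_TIMESTAMP(0)
FROM   stg_db.acct_stg s;
.IF ERRORCODE <> 0 THEN .QUIT 8

/* Journal to base: upsert via delete-then-insert on the business key */
DELETE FROM base_db.acct
WHERE  acct_id IN (SELECT acct_id FROM jrnl_db.acct_jrnl);
INSERT INTO base_db.acct (acct_id, acct_name, balance)
SELECT acct_id, acct_name, balance
FROM   jrnl_db.acct_jrnl;
.IF ERRORCODE <> 0 THEN .QUIT 8

.LOGOFF
.QUIT 0
EOF
```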
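And a hedged sketch of the SQL Server-to-Hadoop migration with Sqoop; the host, database, table, column, and path names are placeholders chosen for illustration.

```sh
# Import a SQL Server table into HDFS, splitting work across 4 mappers
# on a numeric key so the extraction runs in parallel.
sqoop import \
  --connect "jdbc:sqlserver://sqlhost:1433;databaseName=sales_db" \
  --driver com.microsoft.sqlserver.jdbc.SQLServerDriver \
  --username etl_user \
  --password-file /user/etl/.sqlserver.pwd \
  --table online_orders \
  --split-by order_id \
  --target-dir /data/landing/online_orders \
  --num-mappers 4
```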
Environment: Informatica Power Center 9.1/9.5, SAS DI Studio 9.4, Teradata 1 .x/12.x, Oracle 10G/11G, UNIX, SQL Server 2012/2008, Java, J2EE, Spring, Control-M, Hadoop, and Windows.
Confidential, Dallas, TX
Teradata Developer.
Responsibilities:
- Participated in weekly all-hands meetings with onsite teams and in client interactions to resolve business logic issues and concerns; took part in knowledge transition activities, escalated techno-functional issues to the leads, and ensured task timelines and quality.
- Identified and analyzed the root cause of bugs and defects raised by users and the QA team, prioritizing them by criticality
- Resolved issues related to the Enterprise Data Warehouse (EDW) and stored procedures in the OLTP system; analyzed, designed, and developed ETL strategies, and reviewed functional requirements and mapping documents while troubleshooting and problem solving.
- Extracted data from Teradata into HDFS using Sqoop, and exported the analyzed patterns back to Teradata using Sqoop.
- Troubleshot long-running sessions and fixed the related issues; tuned mappings by removing source/target and Expression bottlenecks to improve the throughput of the data loads.
- Used Informatica Designer to create complex mappings using transformations such as Filter, Router, Connected and Unconnected Lookup, Stored Procedure, Joiner, Update Strategy, Expression, and Aggregator to pipeline data to the Data Mart.
- Very good understanding of database skew, PPI, join methods and join strategies, and Join Indexes, including sparse, aggregate, and hash join indexes.
- Worked on Informatica PowerCenter tools including Designer, Repository Manager, Workflow Manager, and Workflow Monitor; working knowledge of Microservices and AWS.
- Experienced with XML technologies such as DTD, XML, XSL, XSD, and XSLT, and with parsing XML using the SAX and DOM parser APIs.
- Extracted, transformed, and loaded data into the data warehouse using Informatica PowerCenter, and generated various reports on a daily, weekly, monthly, and yearly basis.
- Extensively used Netezza utilities to load data and execute SQL scripts from UNIX; analyzed highly complex data sets, performed ad-hoc data analysis and manipulation, and created source-to-target mappings (STMs) involving complex transformations.
- Created BTEQ scripts to pre-populate the work tables before the main load process, and created Primary Indexes both for planned access paths and for even data distribution across all available AMPs (see the sketch after this list).
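A minimal sketch of the work-table pre-population referenced above, assuming hypothetical objects (wrk_db.ord_wrk, src_db.ord_stage) and a high-cardinality ord_id column as the PRIMARY INDEX so rows hash evenly across the AMPs.

```sh
#!/bin/ksh
# Hedged sketch: build and pre-populate a work table before the main load.
bteq <<EOF
.LOGON tdprod/etl_user,${TD_PWD}

/* PRIMARY INDEX on a high-cardinality key gives even AMP distribution
   and matches the planned access path of the main load */
CREATE MULTISET TABLE wrk_db.ord_wrk (
    ord_id   INTEGER NOT NULL,
    cust_id  INTEGER,
    ord_amt  DECIMAL(18,2)
) PRIMARY INDEX (ord_id);

/* Pre-populate only the rows the main load will touch */
INSERT INTO wrk_db.ord_wrk
SELECT ord_id, cust_id, ord_amt
FROM   src_db.ord_stage
WHERE  load_dt = CURRENT_DATE;
.IF ERRORCODE <> 0 THEN .QUIT 8

.LOGOFF
.QUIT 0
EOF
```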
Environment: Teradata 12.0/13.0/14.0/15.0, Informatica Power Center 9.0.1, DB2, ERWIN, SSH (secure shell), Fastload, Mload, UNIX, TOAD, SQL server, GITHUB, VSS, WinSCP.
Confidential
ETL Developer.
Responsibilities:
- Designed and developed Mapplets, reusable transformations, and reusable sessions, and created parameter files wherever necessary to facilitate reusability (a parameter-file sketch follows this list)
- Effectively worked in a versioned Informatica environment and used deployment groups to migrate objects.
- Fixed invalid mappings and troubleshot technical problems with the database.
- Developed various transformations like Source Qualifier, Sorter transformation, Joiner transformation, Update Strategy, Lookup transformation, Expressions and Sequence Generator for loading the data into target table.
- Extracted data from Oracle and SQL Server, then used Teradata for data warehousing.
- Developed the FRD (Functional Requirements Document) and the data architecture document and communicated them to the concerned stakeholders; conducted impact and feasibility analysis
- Experience in all phases of the SDLC, from requirements gathering, design, development, and testing to production, user training, and support for the production environment.
- Used Query Studio to create ad-hoc reports, and worked with heterogeneous sources to extract data from Oracle databases, XML, and flat files, then loaded it into a relational Oracle warehouse.
- Implemented documentation standards and practices to make mappings easier to maintain; experienced in performance tuning of Informatica mappings using various components
- Extensive experience with different types of transformations, such as Aggregator, Lookup, Source Qualifier, Expression, Filter, Update Strategy, Stored Procedure, Sequence Generator, Joiner, and XML.
- Actively involved in production support, along with knowledge transfer and support for other team members.
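A hedged sketch of generating a per-run Informatica parameter file from shell, as mentioned in the first bullet above; the folder, workflow, session, connection, and path names (FOLDER_DW, wf_load_customers, s_m_load_customers, ORA_SRC) are hypothetical examples of the convention, not the actual project objects.

```sh
#!/bin/ksh
# Write a parameter file the workflow picks up at run time; the
# [folder.WF:workflow.ST:session] header scopes the values below it.
cat > /opt/infa/param/wf_load_customers.par <<EOF
[FOLDER_DW.WF:wf_load_customers.ST:s_m_load_customers]
\$DBConnection_Src=ORA_SRC
\$\$LOAD_DT=$(date +%Y-%m-%d)
EOF
```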
Environment: Informatica 8.6.1, PowerExchange 8.1, Oracle 11g/10g/9i/8i, SQL Server 2005, UNIX, TOAD (Quest Software), Windows NT 4.0.
Confidential
ETL Analyst.
Responsibilities:
- Performed unit testing at various levels of ETL and actively involved in team code reviews.
- Developed data mappings from source systems to warehouse components, and created and monitored batches and sessions using the Informatica PowerCenter Server
- Created Reusable Transformations and Mappings using Informatica Designer, and processing tasks using Workflow Manager, to move data from multiple sources into targets.
- Expertise in developing Oracle PL/SQL packages, procedures, and functions; experienced with Oracle Warehouse Builder for implementing changes to the operational data store.
- Involved in performance tuning of programs, ETL procedures, and processes
- Tuned SQL queries and stored procedures for fast data extraction to troubleshoot and resolve issues in the OLTP environment.
- Experience in requirements gathering and business analysis; wrote PL/SQL packages and stored procedures to implement business rules and validations.
- Responsible for ETL procedures and Star Schemas to optimize load and query performance; involved in creating universes and generating reports against the Star Schema
- Used Teradata as a source system and prepared Technical Design documents and Test cases.
- Used Autosys to schedule various data cleansing scripts and loading processes, and maintained the batch processes using UNIX scripts (a wrapper-script sketch follows this list).
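Below is a minimal sketch of the kind of UNIX wrapper an Autosys job could call: it runs a cleansing step and then a load step, logs both, and exits nonzero on failure so the scheduler can alert. All paths and script names (/opt/etl/bin/cleanse_input.sh, load_customers.sh) are placeholders.

```sh
#!/bin/ksh
# Hedged sketch of an Autosys-scheduled batch wrapper with a dated log file.
LOGDIR=/var/etl/logs
RUN_DT=$(date +%Y%m%d)
LOG=$LOGDIR/cleanse_load_$RUN_DT.log

{
    echo "cleansing start: $(date)"
    /opt/etl/bin/cleanse_input.sh /data/inbound/customers.dat || exit 1

    echo "load start: $(date)"
    /opt/etl/bin/load_customers.sh || exit 2

    echo "batch complete: $(date)"
} >> "$LOG" 2>&1
```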
Environment: Teradata 14.0, Informatica, Mainframe files, Oracle, PL/SQL.