Sr ETL Teradata/Informatica Developer Resume
Boston
SUMMARY:
- Around 9.6 years of IT experience in all phases of the SDLC in DW/BI (Analysis, Design, Development, Testing, Implementation and Operations).
- 9 years of experience in Data Warehousing using ETL and OLAP tools: Informatica PowerCenter 9.5/9.1/8.x/7.x, Informatica PowerExchange 9.x/8.x, OBIEE 11g/10g, Teradata, Informatica Data Quality 8.x, Informatica Big Data Edition and Informatica Cloud.
- Worked across various domains, including Healthcare, Banking, Financial Services and Retail, with extensive work on Marketing data.
- Expertise in creating databases, users, tables, triggers, macros, views, stored procedures, functions, packages, and join and hash indexes in the Teradata database.
- Extensively worked with Teradata utilities like BTEQ, FastExport, FastLoad and MultiLoad to export and load data to/from different source systems, including flat files.
- Hands-on experience using query tools like TOAD, SQL Developer, PL/SQL Developer, Teradata SQL Assistant and Queryman.
- Expertise in writing large/complex queries using SQL.
- Proficient in performance analysis, monitoring and SQL query tuning using EXPLAIN PLAN, Collect Statistics, hints and SQL Trace in both Teradata and Oracle.
- Excellent experience in ETL Tools like Informatica and on implementing Slowly Changing Dimensions (SCD).
- Good knowledge of UNIX shell scripting.
- Excellent experience with different index types (PI, SI, JI, AJI, PPI (MLPPI, SLPPI)) and Collect Statistics.
- In-depth, hands-on experience in database and ETL/ELT design and development, with excellent data analysis skills.
- Well versed in all stages of the Software Development Life Cycle (SDLC): requirements gathering and analysis, design/redesign, implementation and testing.
- Good knowledge of Agile methodology and the Scrum process.
- Complete domain and development life-cycle knowledge of Data Warehousing and Client-Server concepts, along with basic data modeling knowledge.
- Fast learner with strong analytical thinking, decision-making and problem-solving skills.
- Experienced in working with both 3NF and dimensional models for data warehouses, with a good understanding of OLAP/OLTP systems.
- Proficient in preparing high/low-level documents like design and functional specifications.
- Actively involved in quality processes and release management activities to establish, monitor and streamline quality processes in the project.
TECHNICAL SKILLS:
Data warehousing: Informatica PowerCenter 9.5/9.1/8.6/8.5/8.1/7.1, Informatica PowerExchange 9.5/8.6/8.1, Informatica Big Data Edition, Teradata utilities (FastLoad, MultiLoad, FastExport, BTEQ, TPT), Informatica IDQ 8.x, Informatica PowerCenter Visio 9.x, Informatica Metadata Manager
BI Tools: OBIEE 11g/10g
Databases: Oracle Exadata/11g/10g/9i, Sybase, Teradata 13/12/V2R6, MS Access, DB2 8.0/7.0, MS SQL Server
Languages: XML, SQL, T-SQL, PL/SQL, UNIX Shell Scripting, Perl, Java
Operating System: HP-UX 11/10/9, IBM AIX 6.1/5.3, Sun Solaris 9/8/7, SCO UNIX, Linux, Windows 95/98/2000/XP Professional/Vista/8
Other Tools: MS Visual SourceSafe, ZENA, AutoSys, Control-M, Unicenter, Remedy, Clarity, TIDAL, JIRA, SVN Repository
DB Tools: SQL*Plus, SQL*Loader, Export/Import, TOAD, SQL Navigator, SQL Trace, MLOAD, FLOAD, FEXPORT, TPUMP
Microsoft Tools: MS Office, MS FrontPage, MS Outlook, MS Project, MS Visio
PROFESSIONAL EXPERIENCE:
Sr ETL Teradata/Informatica Developer
Confidential
Roles and Responsibilities:
- Responsible for developing ETL Informatica mappings to mask the target data.
- Responsible for analyzing the source data (files) sent by the user prior to development and communicating with the application team when required.
- Developed various applications including RxEnRoll, Valencia, EDW, EDW2, Retrodur, PreFazal, Fazal, Pace Verify, and AdvanceClinical (ADVC).
- For each application, designed the mappings, sessions, worklets and workflows.
- Used Data Masking transformation techniques, including Key, Expression and Substitution masking, to mask the production data based on the requirements.
- Familiar with other masking techniques such as Random and Special Mask Formats.
- Worked with various databases, such as SQL Server, Oracle, DB2 and Teradata, across the applications.
- Created parameter files for each application.
- Developed shell scripts to automate the workflows for the RxClaim, QL, Recap and Mail-order applications.
- Performed unit testing.
- Responsible for testing the masked data and providing the unit test documents.
- Built customized functions and reusable objects to facilitate reusability of Informatica objects.
- Responsible for running and monitoring the weekly load and fixing issues when the load fails.
- Worked extensively on fixed-width source files and generated the masked files accordingly.
- Exported data to flat/CSV files using the FastExport utility.
- Worked on data masking using the Data Masking transformation in Informatica PowerCenter 9.x.
- Responsible for fixing load failures caused by MLOAD issues or locks on the target tables.
- Worked on data load requests raised by users.
- Responsible for updating the data discovery documents and fixing the code accordingly when the user or application team confirmed new changes.
- Worked with Expression, Data Masking and Lookup transformations and stored procedures to mask the production data.
- Worked on HP Quality Center to handle the defects raised for the various applications.
- Worked on performance tuning and reduced the data load times of several weekly jobs.
- Responsible for coding and implementing the SCD Type 1 and Type 2 logic to load the dimensions.
- Responsible for writing the BTEQ scripts and automating them alongside the Informatica mappings (a minimal BTEQ sketch follows this list).
- Used transformations such as the source qualifier, normalizer, aggregators, connected & unconnected lookups, filters, sorter, stored procedures, router & sequence generator.
- Checked and tuned the performance of the application.
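To illustrate the BTEQ scripting mentioned above, here is a minimal shell-wrapped BTEQ step with error handling. The host, credentials, table names and the TD_PWD environment variable are illustrative assumptions, not details from the project.

```sh
#!/bin/sh
# Hypothetical BTEQ wrapper: runs one SQL step and fails the job on any error.
bteq <<EOF
.LOGON tdprod/etl_user,${TD_PWD};

UPDATE stg_db.customer_stg          /* placeholder table */
SET    load_status = 'READY'
WHERE  load_dt = CURRENT_DATE;

/* propagate any SQL failure to the shell as a non-zero exit code */
.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
.QUIT 0;
EOF

rc=$?
if [ $rc -ne 0 ]; then
    echo "BTEQ step failed with return code $rc" >&2
    exit $rc
fi
```

A wrapper like this lets a scheduler (or an Informatica command task) detect the failure through the script's exit code.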
Environment: Informatica PowerCenter 9.6, Oracle 11g, Teradata 14.0, Teradata SQL Assistant, UNIX, SQL, VSS, Outlook, PuTTY, DB2, SQL Server
Sr ETL Teradata Developer
Confidential, Boston
Roles & Responsibilities:
- Involved in effort estimation for the requirements in the project and prepared mapping documents based on client requirement specifications.
- Used File Bridge for automated DQ validation of flat files.
- Developed SCD Type 1 and Type 2 mappings at the Informatica level and wrote stored procedures to perform the Type 1 and Type 2 operations (see the SCD Type 2 sketch after this list).
- Developed automated stored procedures, used as pre- and post-SQL at the Informatica session level, to load the dimension and fact tables based on the table type.
- Used EXPLAIN plans for performance optimization of the BTEQ queries.
- Analyzed the various bottlenecks at source, target, mapping and session level.
- Tuned the mappings and SQL scripts for better performance.
- Designed the ETL processes using Informatica to load data from DB2, SQL Server and flat files to the target database.
- Responsible for providing technical assistance for the design and execution of ETL projects to onshore and offshore developers.
- Created and scheduled the sessions.
- Created various tasks to apply conditional logic in the workflows.
- Extensively created reusable transformations and mapplets to standardize business logic.
- Built mappings, mapplets and sessions for data loads and data cleansing, and enhanced the existing mappings in Informatica PowerCenter as changes were required.
- Extracted data from Oracle and flat files; developed and implemented various enhancements to the application through production and new production rollouts.
- Worked on identifying facts, dimensions and various other concepts of dimensional modeling which are used for data warehousing projects.
- Wrote, tested and implemented Teradata FastLoad, MultiLoad, FastExport and BTEQ scripts, DML and DDL.
- Performance tuned and optimized various complex SQL queries.
- Responsible for writing UNIX shell scripts.
- Gathered system design requirements and designed and wrote system specifications.
- Performed unit testing as per the design document and the mapping design flow.
- Used the volatile-table technique when writing the BTEQ scripts.
- Created mapplets to reduce development time and mapping complexity and to improve maintainability.
- Responsible for unit test execution (component and assembly testing).
- Prepared test cases for component and UAT testing and fixed defects in all environments.
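As a concrete illustration of the Type 2 logic above, here is a hedged BTEQ sketch of the close-out/insert pattern; the database, table and column names are invented for the example, and the real implementation lived in stored procedures and mappings.

```sh
#!/bin/ksh
# Illustrative SCD Type 2 load in BTEQ (all object names are placeholders).
bteq <<EOF
.LOGON tddev/etl_user,${TD_PWD};

BT;  /* one transaction: close-out and insert succeed or fail together */

/* Step 1: expire the current row for every changed business key */
UPDATE tgt
FROM edw_db.customer_dim AS tgt, stg_db.customer_stg AS src
SET    end_dt = CURRENT_DATE - 1,
       current_flag = 'N'
WHERE  tgt.customer_id = src.customer_id
AND    tgt.current_flag = 'Y'
AND    (tgt.addr <> src.addr OR tgt.segment <> src.segment);

/* Step 2: insert a new current version for changed and brand-new keys */
INSERT INTO edw_db.customer_dim
  (customer_id, addr, segment, start_dt, end_dt, current_flag)
SELECT src.customer_id, src.addr, src.segment,
       CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM   stg_db.customer_stg src
LEFT JOIN edw_db.customer_dim tgt
  ON  tgt.customer_id = src.customer_id
  AND tgt.current_flag = 'Y'
WHERE  tgt.customer_id IS NULL;

ET;
.LOGOFF;
.QUIT;
EOF
```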
Environment: Informatica PowerCenter 9.6.1, Oracle 11g, JIRA, SVN, TIDAL Scheduler, Teradata 14, SQL Server, DB2, Teradata SQL Assistant, SQL, VSS, Outlook, PuTTY, MLOAD, TPUMP, FAST LOAD, FAST EXPORT, TDWM, PMON, DBQL
Sr ETL Informatica Developer
Confidential
Roles & Responsibilities:
- Participated in all phases of the development life cycle.
- Worked on all the UNIX/Informatica setup required for the Informatica upgrade.
- Responsible for requirements gathering, technical system design, development, testing, code review, code migration, UAT and job scheduling.
- Actively participated in understanding business requirements from the Business Analysts and in analyzing and designing the data migration/integration process.
- Interacted with the Data Architecture group, PM and project team to finalize the TSD design strategies; created various diagrams as part of the project deliverables and physical database design.
- Used Informatica PowerCenter and UNIX scripts, orchestrated through the Control-M tool, to extract, transform and load data.
- Responsible for code migration, code review, and test plans, test scenarios and test cases as part of unit/integration testing.
- Automated the production jobs using Informatica workflows and shell scripts through the Control-M tool.
- Responsible for creating DDLs and granting privileges on the DB objects to users and roles (see the grant sketch after this list).
- Responsible for syncing the ETL code, UNIX scripts and DB objects to the higher environments.
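The privilege grants were plain Oracle DDL; a minimal sketch of how such a step can be scripted is below. The connect string, schema, object and role names are assumptions for illustration.

```sh
#!/bin/sh
# Hypothetical grant script: applies privileges via SQL*Plus, stops on error.
sqlplus -s "$ORA_USER/$ORA_PWD@$ORA_SID" <<EOF
WHENEVER SQLERROR EXIT SQL.SQLCODE
GRANT SELECT, INSERT, UPDATE ON edw_owner.customer_dim TO etl_role;
GRANT SELECT ON edw_owner.customer_dim TO report_reader;
EXIT
EOF
```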
Environment: Informatica 9.5/9.6, UNIX, Oracle, SQL, QuickBuild, TortoiseSVN, JIRA, Control-M, TOAD, MS Office
Sr ETL Informatica Developer
Confidential, Denver
Roles and Responsibilities:
- Responsible for developing ETL Informatica mappings to mask the target data.
- Responsible for analyzing the source data (files) sent by the user prior to development and communicating with the application team when required.
- Responsible for testing the masked data and providing the unit test documents.
- Built customized functions and reusable objects to facilitate reusability of Informatica objects.
- Responsible for running and monitoring the weekly load and fixing issues when the load fails.
- Worked extensively on fixed-width source files and generated the masked files accordingly.
- Used the Teradata utilities FastLoad and MultiLoad and BTEQ queries to load the tables.
- Exported data to flat/CSV files using the FastExport utility.
- Worked on data masking using the Data Masking transformation in Informatica PowerCenter 9.x.
- Responsible for fixing load failures caused by MLOAD issues or locks on the target tables.
- Used volatile tables when writing the BTEQ scripts (a volatile-table sketch follows this list).
- Worked on data load requests raised by users.
- Responsible for updating the data discovery documents and fixing the code accordingly when the user or application team confirmed new changes.
- Worked with Expression, Data Masking and Lookup transformations and stored procedures to mask the production data.
- Worked on HP Quality Center to handle the defects raised for the various applications.
- Worked on performance tuning and reduced the data load times of several weekly jobs.
- Responsible for coding and implementing the SCD Type 1 and Type 2 logic to load the dimensions.
- Responsible for developing the mappings using PowerCenter and the Teradata TPT operator.
- Responsible for writing the BTEQ scripts and automating them alongside the Informatica mappings.
- Responsible for loading the detail and summary fact tables in the data mart.
- Responsible for writing the SQL to generate KPI reports for end users in OBIEE and Tableau.
- Performed ETL vs. user data reconciliation for the reports generated in OBIEE.
- Used transformations such as the source qualifier, normalizer, aggregators, connected & unconnected lookups, filters, sorter, stored procedures, router & sequence generator.
- Checked and tuned the performance of the application.
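A hedged sketch of the volatile-table pattern used in the BTEQ work: stage an intermediate result in a session-local table, then use it in the final insert. All object names are invented for the example.

```sh
#!/bin/sh
# Illustrative BTEQ script using a volatile table as a staging step.
bteq <<EOF
.LOGON tdprod/etl_user,${TD_PWD};

CREATE VOLATILE TABLE vt_claims_delta AS
(
  SELECT claim_id, member_id, claim_amt
  FROM   stg_db.claims_stg           /* placeholder source */
  WHERE  load_dt = CURRENT_DATE
)
WITH DATA
PRIMARY INDEX (claim_id)
ON COMMIT PRESERVE ROWS;             /* keep rows for the rest of the session */

INSERT INTO edw_db.claims_fact       /* placeholder target */
SELECT d.claim_id, d.member_id, d.claim_amt, CURRENT_DATE
FROM   vt_claims_delta d;

.LOGOFF;
.QUIT;
EOF
```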
Sr ETL Teradata Developer
Confidential, Columbus
Responsibilities:
- Involved in the complete software development life cycle (SDLC), from business analysis through development, testing, deployment and documentation.
- Used the Teradata utilities FastLoad, MultiLoad and TPump to load data (a FastLoad sketch follows this list).
- Wrote BTEQ scripts to transform data.
- Wrote FastExport scripts to export data.
- Wrote, tested and implemented Teradata FastLoad, MultiLoad and BTEQ scripts, DML and DDL.
- Constructed Korn shell driver routines (wrote, tested and implemented UNIX scripts).
- Wrote views based on user and/or reporting requirements.
- Wrote Teradata Macros and used various Teradata analytic functions.
- Involved in migration projects moving data from Oracle/DB2-based data warehouses to Teradata.
- Performance tuned and optimized various complex SQL queries.
- Wrote many UNIX scripts.
- Good knowledge of Teradata Manager, TDWM, PMON, DBQL, SQL Assistant and BTEQ.
- Gathered system design requirements and designed and wrote system specifications.
- Excellent knowledge of ETL tools such as Informatica and SAP BODS, using various connection types to load and extract data to and from Teradata efficiently.
- Agile team interaction.
- Worked on data warehouses ranging from 30 to 50 terabytes.
- Coordinated with the business analysts and developers to discuss issues in interpreting the requirements.
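For reference, a minimal FastLoad sketch of the kind of load script described above; the file layout, object names and TD_PWD variable are assumptions for illustration (FastLoad requires an empty target table).

```sh
#!/bin/ksh
# Hypothetical FastLoad script: bulk-loads a pipe-delimited file into staging.
fastload <<EOF
LOGON tdprod/etl_user,${TD_PWD};

SET RECORD VARTEXT "|";
DEFINE acct_id   (VARCHAR(18)),
       acct_name (VARCHAR(60)),
       open_dt   (VARCHAR(10))
FILE = /data/inbound/accounts.dat;

BEGIN LOADING stg_db.acct_stg
      ERRORFILES stg_db.acct_stg_err1, stg_db.acct_stg_err2;

INSERT INTO stg_db.acct_stg
  (acct_id, acct_name, open_dt)
VALUES
  (:acct_id, :acct_name, :open_dt);

END LOADING;
LOGOFF;
EOF
```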
Environment: Teradata R12/R13, Teradata SQL Assistant, SQL, VSS, Outlook, Putty, MLOAD, TPUMP, FAST LOAD, FAST EXPORT, TDWM, PMON, DBQL
Sr ETL Teradata/Informatica Developer
Confidential, New York City
Roles and Responsibilities:
- Proficient in understanding business processes/requirements and translating them into technical requirements
- Worked with the Source Analyzer, Target Designer, Transformation Developer, Mapping Designer and Workflow Manager.
- Involved in data design and modeling, specifying the physical infrastructure, and in system study, design and development, applying the Ralph Kimball methodology of dimensional modeling.
- Imported various Application Sources, created Targets and Transformations using Informatica Power Center Designer (Source analyzer, Transformation developer, Mapplet designer, and Mapping designer).
- Designed and developed Informatica Mappings and Sessions based on business user requirements and business rules to load data from source flat files and RDBMS tables to target tables.
- Worked with various transformations to solve Slowly Changing Dimension problems using Informatica PowerCenter.
- Developed and scheduled Workflows using task developer, worklet designer, and workflow designer in Workflow manager and monitored the results in Workflow monitor.
- Performed performance tuning at the source, transformation, target and workflow levels.
- Used pmcmd and UNIX shell scripts for workflow automation (see the wrapper sketch after this list).
- Parameterized the mappings and increased their reusability.
- Used transformations such as the source qualifier, normalizer, aggregators, connected & unconnected lookups, filters, sorter, stored procedures, router & sequence generator.
- Checked and tuned the performance of the application.
- Created and defined DDLs for the tables in the staging area.
- Responsible for creating shared and reusable objects in the Informatica shared folder and updating those objects as new requirements and changes arose.
- Documented user requirements, translated requirements into system solutions and developed the implementation plan and schedule.
- Migrated code between the development, test and production repositories.
- Involved in creating and managing global and local repositories and assigning permissions using Repository Manager; also migrated repositories between the development, testing and production systems.
- Performed unit and development testing of my mappings at the ETL level.
- Involved in UAT Support.
- Involved in peer reviews of LLDs, mappings, sessions and PWX data maps.
- Monitored currently running jobs.
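The workflow automation mentioned above typically wraps pmcmd in a shell script; a hedged sketch follows, with the domain, service, folder and workflow names as placeholders.

```sh
#!/bin/sh
# Hypothetical pmcmd wrapper: start a workflow, wait for completion,
# and propagate the result code to the scheduler.
pmcmd startworkflow \
    -sv INFA_INT_SVC -d INFA_DOMAIN \
    -u "$INFA_USER" -p "$INFA_PWD" \
    -f EDW_FOLDER -wait wf_load_customer_dim
rc=$?

if [ $rc -ne 0 ]; then
    echo "wf_load_customer_dim failed (pmcmd rc=$rc)" >&2
    exit $rc
fi
echo "wf_load_customer_dim completed successfully"
```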
Environment: Informatica PowerCenter 9.1.0/9.5, Workflow Manager, Workflow Monitor, PL/SQL, SQL, Oracle 11g/10g, TOAD 10.6
Sr ETL Teradata/Informatica Developer
Confidential, Jersey City, NJ
Roles and Responsibilities:
- Involved in the design, development and maintenance of the database for the data warehouse project.
- Involved in Business Users Meetings to understand their requirements.
- Converted business requirements into technical documents (BRDs) and explained the business requirements to the developers in technical terms.
- Developed ETL Mappings and Test plans.
- The data flow diagrams ranged from the OLTP systems through staging to the data warehouse.
- Developed test plans to verify the logic of every mapping in a session, covering count verification, lookup hits, transformation of each data element, filters, aggregations and target counts.
- Developed complex Informatica mappings using various transformations: Source Qualifier, Normalizer, Filter, Connected and Unconnected Lookup, Update Strategy, Router, Aggregator, Sequence Generator and reusable Sequence Generator.
- Extensively used SCDs (Slowly Changing Dimensions) to handle incremental loading of dimension and fact tables.
- Designed various mappings to extract data from various sources, including flat files, Oracle, SQL Server and IBM DB2.
- Used the Teradata utilities FastLoad, MultiLoad and TPump to load data.
- Wrote BTEQ scripts to transform data.
- Wrote FastExport scripts to export data.
- Worked on debugging and troubleshooting the Informatica application, using the Informatica Debugger.
- Worked on performance tuning to optimize session performance, utilizing partitioning, pushdown optimization, and pre- and post-session stored procedures to drop and rebuild constraints.
- Worked on the Teradata utilities BTEQ, MLOAD, FLOAD and TPUMP to load the staging area.
- Created UNIX scripts for ETL jobs, session log cleanup and dynamic parameter files (a housekeeping sketch follows this list).
- Created and scheduled sessions and jobs to run on demand, on schedule, or only once.
- Monitored Workflows and Sessions using Workflow Monitor and Scheduler alert editor.
- Performed unit, integration and system testing of Informatica mappings.
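A hedged sketch of the housekeeping scripting described above: generate a run-date parameter file for a workflow and purge old session logs. The paths, folder and workflow names, and the parameter names are assumptions.

```sh
#!/bin/sh
# Illustrative housekeeping script: dynamic parameter file + log cleanup.
RUN_DT=$(date +%Y-%m-%d)
PARAM_FILE=/infa/params/wf_daily_load.parm    # placeholder path

# Build the parameter file consumed by the workflow at run time
cat > "$PARAM_FILE" <<EOF
[EDW_FOLDER.WF:wf_daily_load]
\$\$RUN_DATE=$RUN_DT
\$\$SRC_DIR=/data/inbound
EOF

# Purge session logs not modified in the last 30 days
find /infa/logs/sesslogs -type f -name "*.log" -mtime +30 -exec rm -f {} \;
```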
Environment: Informatica PowerCenter Cloud 9.5/9.1, Informatica PowerExchange 9.1, Oracle Exadata, Teradata 13, MS SQL Server 2008 R2, T-SQL, Erwin 8, SQL Server 2008, TOAD, PL/SQL Developer, Linux, Tidal, Shell Scripting, Perl Scripting, Windows XP, PuTTY, DB2 Mainframe, OBIEE 11g, Informatica Cloud, Salesforce
Teradata/Informatica Developer
Confidential, Bridgewater, NJ
Roles and Responsibilities:
- Involved in requirements gathering and business analysis and created tech specs for the ETL process.
- Developed data mappings between source systems and warehouse components using the Mapping Designer.
- Setup folders, groups, users, and permissions and performed Repository administration using Repository Manager.
- Design, development and implementation, including several modifications, enhancements and maintenance of existing software.
- Built the dimension and fact table load processes and the reporting process using Informatica and OBIEE.
- Consolidated sales order data using Informatica and loaded it into aggregate tables for reporting.
- Coordinated with Onsite and Offshore team.
- Developed various complex mapplets and stored procedures to facilitate loading of data on a weekly and monthly basis.
- Worked on Various transformations such as Joiner, Filter, Update Strategy, Expression, Router, Lookup, Union, and Aggregator.
- Administered and developed monitoring scripts for the PowerCenter server process and workflow jobs (see the monitoring sketch after this list).
- Designed and developed enhanced data models and ETL applications to meet the business requirement.
- Mapped Type 1 and Type 2 Slowly Changing Dimensions (SCD) using Informatica.
- Ensured timely delivery of ETL tasks using the Workflow Manager and scheduler.
- Worked closely with DBA and developers during planning, analyzing, developing and testing phase of the project.
- Involved in Designing of Data Modeling for the Data warehouse.
- Involved in performance tuning of the Informatica mappings and the SQL queries inside the Source Qualifier.
- Involved in performance tuning of the database and Informatica; improved performance by identifying and rectifying the bottlenecks.
- Involved in user acceptance testing; reported and tracked defects in JIRA and resolved them.
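A minimal sketch of the kind of process monitoring described above; the process name (pmserver) and the alert address are assumptions and would differ by PowerCenter version and site.

```sh
#!/bin/sh
# Hypothetical monitor: alert if the PowerCenter server process is not running.
# The [p] bracket trick keeps grep from matching its own command line.
if ! ps -ef | grep '[p]mserver' > /dev/null; then
    echo "PowerCenter server process not found on $(hostname)" \
        | mailx -s "INFA server DOWN" oncall@example.com
fi
```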
Environment: Informatica PowerCenter 8.6.1, Informatica PowerExchange 9.1, MS Office/Visio 2007, Oracle Exadata, DB2 Mainframe, UNIX, AIX, Rally, MS Series, Web Services, Java, Tidal, PuTTY, Basecamp, Erwin 7.5, PL/SQL
ETL/ Informatica Developer
Confidential
Roles and Responsibilities:
- Designed and developed complex mappings using various transformations in Designer to extract the data from sources like Oracle, SQL Server and flat files to perform mappings based on company requirements and load into Oracle tables.
- Extensively worked on various sources like flat files, databases, XML and web services.
- Created and altered tables, triggers, sequences and other objects using the DDL, DML and DCL statements and utilities of Oracle.
- Extensively worked in the performance tuning of ETL procedures and processes.
- Performance Tuned Informatica Targets, Sources, mappings & sessions for large data files by increasing data cache size, sequence buffer length and target based commit interval.
- For any issues with the daily running jobs, identified the errors and fixed them within the time limit.
- Used the Partitioned Primary Index (PPI) concept to increase load performance and collected statistics on tables and indexes.
- Mainly responsible for resolving Informatica and database issues for different departments when tickets were opened.
- For any problem with the ETL code, the prime responsibility was to test it in the integration environment before loading into production.
- Used Debugger utility of the Designer tool to check the errors in the mapping and made appropriate changes in the mappings to generate the required results for various tickets.
- Used the Teradata utilities FastLoad and MultiLoad to load data into the tables.
- Worked on FastExport to extract data into CSV files and built the ETL load using the TPT operator (a FastExport sketch follows this list).
- Wrote analytical queries to generate reports and designed normal and materialized views.
- Optimized mappings using transformations such as Aggregator, Filter, Joiner, Expression and Lookup.
- Created daily and weekly workflows and scheduled them to run based on business needs.
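For illustration, a hedged FastExport sketch of the CSV-style extract described above; the log table, credentials, table and column names are invented.

```sh
#!/bin/ksh
# Hypothetical FastExport script: unload query results to a delimited file.
fexp <<EOF
.LOGTABLE stg_db.sales_fexp_log;
.LOGON tdprod/etl_user,${TD_PWD};

.BEGIN EXPORT SESSIONS 4;
.EXPORT OUTFILE /data/outbound/sales_extract.csv MODE RECORD FORMAT TEXT;

SELECT CAST(TRIM(region_cd) || '|' ||
            TRIM(sale_amt)  || '|' ||
            TRIM(sale_dt)   AS CHAR(60))
FROM   edw_db.sales_fact
WHERE  sale_dt >= CURRENT_DATE - 7;

.END EXPORT;
.LOGOFF;
EOF
```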
Environment: Informatica PowerCenter 8.6, Oracle 10g, Teradata V2R5, XML, TOAD, SQL, PL/SQL, IBM AIX, UNIX Shell Scripts, Web Intelligence, DSBASIC, Erwin 6.1, Remedy, Maestro job scheduler, Mercury Quality Center
Informatica Developer
Confidential
Roles and Responsibilities:
- Collected requirements from business users and performed analysis based on those requirements.
- Extensively used flat files; designed and developed complex Informatica mappings using Expression, Aggregator, Filter, Lookup and Stored Procedure transformations to ensure movement of data between various applications.
- Designed and developed complex mappings using various transformations in Designer to extract the data from sources like Oracle, SQL Server and flat files to perform mappings based on company requirements and load into Oracle tables.
- Well versed in creating and altering tables, triggers, sequences and other objects using the DDL, DML and DCL features of Oracle 9i and 10g.
- Checked errors in and tested the ETL procedures and programs using the Informatica session logs.
- Performance Tuned Informatica Targets, Sources, mappings & sessions for large data files by increasing data cache size, sequence buffer length and target based commit interval.
- Performed error handling to remove database exception errors.
- Used FastLoad, MultiLoad and TPT operators to load data into the staging tables (a MultiLoad sketch follows this list).
- Used EXPLAIN plans to identify bottlenecks for performance optimization.
- Configured the Informatica PowerExchange connection and Navigator.
- Created registrations and data maps, and configured real-time mappings and workflows for real-time data processing using the CDC option of Informatica PowerExchange.
- Used Debugger utility of the Designer tool to check the errors in the mapping and made appropriate changes in the mappings to generate the required results.
- Worked with tools like TOAD to write queries and generate the result.
- Provided knowledge transfer to the end users and created extensive documentation on the design, development, implementation, daily loads and process flow of the Informatica mappings. Also involved in the functional and technical design document sessions with the technical team and the business users.
- Actively involved in training on the Portico 8.0/9.0 Provider Manager and Business Administrator modules.
- Worked on the Data Analyzer, Data Quality and Data Profiler tools to handle the business and exception logic for the inbound and outbound feeds.
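As an illustration of the staging loads above, a hedged MultiLoad sketch follows; the layout, the log/work/error table names and the upsert logic are assumptions.

```sh
#!/bin/sh
# Hypothetical MultiLoad script: upsert a pipe-delimited feed into staging.
mload <<EOF
.LOGTABLE stg_db.member_ml_log;
.LOGON tdprod/etl_user,${TD_PWD};

.BEGIN IMPORT MLOAD TABLES stg_db.member_stg
       WORKTABLES stg_db.member_wt
       ERRORTABLES stg_db.member_et stg_db.member_uv;

.LAYOUT member_layout;
.FIELD member_id * VARCHAR(18);
.FIELD member_nm * VARCHAR(60);

/* insert the row when the update finds no matching key */
.DML LABEL upsert_member
     DO INSERT FOR MISSING UPDATE ROWS;
UPDATE stg_db.member_stg
SET    member_nm = :member_nm
WHERE  member_id = :member_id;
INSERT INTO stg_db.member_stg (member_id, member_nm)
VALUES (:member_id, :member_nm);

.IMPORT INFILE /data/inbound/members.dat
        FORMAT VARTEXT '|'
        LAYOUT member_layout
        APPLY upsert_member;

.END MLOAD;
.LOGOFF;
EOF
```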
Environment: Informatica PowerCenter 8.1, Informatica PowerExchange 8.1, DB2 Mainframe, COBOL, Oracle 10g, UNIX, Windows NT 4.0, Sybase, UNIX Shell Programming, PL/SQL, TOAD, PuTTY, WinSCP, Remedy, Teradata