
ETL/Teradata Developer Resume


White Plains, NY

PROFESSIONAL SUMMARY:

  • Around 8 years of experience in Information Technology with a strong background in database development, data warehousing and ETL processes using Informatica PowerCenter 9.x/8.x/7.1.3/7.1.1/6.2 (Repository Manager, Repository Server, Workflow Manager and Workflow Monitor).
  • Proficiency in developing SQL with various relational databases like Oracle, SQL Server.
  • Knowledge in Full Life Cycle development of Data Warehousing.
  • Knowledge of IDQ 9.6.1 standards, guidelines and best practices.
  • Worked with the Informatica Data Quality 9.6.1 (IDQ) toolkit: analysis, data cleansing, data matching, data conversion, exception handling, and the reporting and monitoring capabilities of IDQ.
  • Strong understanding of OLAP and OLTP Concepts.
  • Understood business rules from high-level design specifications and implemented data transformation methodologies.
  • Created UNIX shell scripts to run Informatica workflows and control the ETL flow.
  • Proficient in Oracle Tools and Utilities such as TOAD and SQL*Loader.
  • Extensive knowledge of working with metadata.
  • Wrote Teradata SQL queries for table joins and modifications.
  • Developed complex mappings using Informatica PowerCenter Cloud; used the Teradata utilities FastLoad, MultiLoad and TPump to load data.
  • Wrote Teradata BTEQ scripts to implement business logic.
  • Created customized MLoad scripts on the UNIX platform for Teradata loads.
  • Strong hands-on experience using Teradata utilities (SQL, BTEQ, FastLoad, MultiLoad, FastExport, TPump, Visual Explain, Queryman), Teradata parallel support and UNIX shell scripting.
  • Proficient in coding optimized Teradata batch-processing scripts for data transformation, aggregation and load using BTEQ.
  • Highly skilled IT professional with 2 years and 4 months of experience in Informatica and SAP Business Objects.
  • Experience with AGILE Methodologies.
  • Experienced in the use of agile approaches, including Extreme Programming, Test-Driven Development and Scrum.
  • Experience in various stages of the System Development Life Cycle (SDLC) and its approaches, such as Agile and the Prototyping Model.
  • Knowledge on SAP BO Xcelsius dashboard reports, Webi Reporting
  • Experience in SQL performance tuning.
  • Experience in maintaining Data Concurrency, Replication of data.
  • Good knowledge on BI tools like SQLBI and OBI.
  • Thorough Knowledge in creating DDL, DML and Transaction queries in SQL for Oracle database.
  • Performed SAP Data Migration by using Business Objects Data Services as the ETL tool.
  • Design, Architect BizTalk EDI Interfaces and HIPAA EDI Frame-work components
  • Implemented Encryption and Secure Transfer of 834 EDI Enrollment Transactions using BizTalk Adapter
  • Experience in using SQL developer, SQL*Plus, Procedures/Functions and Performance tuning.
  • Expertise in RDBMS concepts, with hands on exposure in the development of relational database environment using SQL, PL/SQL, Cursors, Stored Procedures, Functions and Triggers.
  • Experience in working with Perl Scripts for handling data coming in Flat files.
  • Experience working with UNIX Shell Scripts for automatically running sessions, aborting sessions and creating parameter files.
  • Strong with relational database design concepts.
  • Expertise in the design and development of the three layers (Physical, Business Model and Mapping, Presentation) of an OBIEE metadata repository (.rpd) using Oracle.
  • Extracted data from various heterogeneous sources like Oracle and SAP.
  • Experienced in configuring and setting up OBIEE security using LDAP and external database tables, and configuring object-level and database-level security.
  • Expertise in building executive dashboards using Dashboard Manager for Siebel Analytics Web / Oracle Presentation Services for Siebel Analytics / OBIEE.
  • Performed data validation by Unit testing, integration testing and System Testing.
  • Extensive experience in managing teams/On Shore-Off Shore Coordination/Requirement Analysis/Code reviews/Implementing Standards.
  • Good experience with Informatica Performance Tuning.
  • Good knowledge of data modeling techniques like Dimensional/Star Schema, Snowflake modeling and Slowly Changing Dimensions (SCD) using Erwin.
  • Flexible, enthusiastic and project-oriented team player with solid communication and leadership skills to develop creative solutions for challenging client needs.
  • Experience in Informatica PowerCenter and implementation experience in B2B Data Transformation (DT) and B2B Data Exchange (DX)
  • Excellent interpersonal and communication skills, and is experienced in working with senior level managers, business people and developers across multiple disciplines.
  • Able to work independently and collaborate proactively & cross functionally within a team.
  • Hands-on experience across all stages of Software Development Life Cycle (SDLC) including business requirement analysis, data mapping, build, unit testing, systems integration and user acceptance testing.
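As one concrete illustration of the Teradata load utilities listed above, a MultiLoad (MLoad) control script of the kind these bullets describe typically follows the pattern below. This is a minimal sketch only; every database, table, layout and file name is a hypothetical placeholder:

```sql
.LOGTABLE etl_wk.sales_ml_log;            /* restart/checkpoint log table */
.LOGON tdprod/etl_user,<password>;

.BEGIN IMPORT MLOAD
    TABLES stg.daily_sales                /* hypothetical target table   */
    WORKTABLES etl_wk.sales_wt
    ERRORTABLES etl_wk.sales_et1 etl_wk.sales_et2;

.LAYOUT sales_layout;                     /* describes the input record  */
    .FIELD sale_id  * VARCHAR(18);
    .FIELD sale_dt  * VARCHAR(10);
    .FIELD amount   * VARCHAR(20);

.DML LABEL ins_sales;                     /* DML applied per input row   */
    INSERT INTO stg.daily_sales (sale_id, sale_dt, amount)
    VALUES (:sale_id, :sale_dt, :amount);

.IMPORT INFILE /data/in/daily_sales.dat   /* pipe-delimited flat file    */
    FORMAT VARTEXT '|'
    LAYOUT sales_layout
    APPLY ins_sales;

.END MLOAD;
.LOGOFF;
```

The same skeleton extends to UPDATE/DELETE labels for maintenance loads; the error tables capture rejected rows for post-load analysis.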

TECHNICAL SKILLS:

ETL Tools: Informatica Power Center 9.6/9.5/9.1, 8.6.1/8.5/8.1.1, Power Exchange 9.6/9.5/9.1/8.6, Informatica IDQ.

Databases: Teradata V15/14/13/12/V2R6/V2R5, Oracle 12c/11g/10g, MS SQL Server 2008/2012, MS Access, DB2, Sybase, Greenplum

Operating Systems: UNIX, Linux, Windows 2003/2000/NT, Windows 7/8, AIX, Sun Solaris

Languages: C, C++, SQL, PL/SQL, HTML, PHP, JAVA, UNIX Scripting

Reporting Tools: Business Objects XIR2, SAP BI 7.0, OBIEE 12c/11g, Power-BI

Other tools: Toad, Harvest, SCM, Putty, Tidal, Autosys, ESP, Informatica Hierarchy Manager (HM), SQL Assistant

PROFESSIONAL EXPERIENCE:

Confidential, White Plains, NY

ETL/ Teradata Developer

Responsibilities:

  • Involved in client meetings on daily basis for understanding and analyzing the business requirements.
  • Worked with BTEQ in a UNIX environment and executed TPT scripts from the UNIX platform.
  • Used SQL Assistant to query Teradata tables.
  • Worked with complex SQL queries to test the data generated by ETL process against target data base.
  • Wrote numerous BTEQ scripts to run complex queries on the Teradata database.
  • Used volatile tables and derived queries to break up complex queries into simpler ones.
  • Created a cleanup process to remove the intermediate temp files used prior to the loading process. Streamlined the migration process for Teradata scripts and shell scripts on the UNIX box.
  • Wrote the incremental logic for the dimensional tables to run Incremental Load in BTEQ scripts.
  • Extensively worked with Teradata utilities like BTEQ, Fast Export, Fast Load, Multi Load to export and load data to/from different source systems including flat files
  • Expertise in creating databases, users, tables, triggers, macros, views, stored procedures, functions, Packages, joins and hash indexes in Teradata database.
  • Proficient in performance analysis, monitoring and SQL query tuning using EXPLAIN PLAN, COLLECT STATISTICS, hints and SQL Trace in both Teradata and Oracle.
  • Wrote Teradata macros and used various Teradata analytic functions. Monitored Teradata queries using Teradata Viewpoint.
  • Creating and maintaining source-target mapping documents for ETL development team.
  • Scheduled the BTEQ scripts using the Control-M tool and wrote shell scripts to run collect-statistics steps from Control-M jobs.
  • Provided queries to the Power-BI reporting team to build reports.
  • Extensively worked on Facts and Slowly Changing Dimension (SCD) tables.
  • Maintained source and target mappings, transformation logic and processes to reflect the changing business environment over time.
  • Worked on different tasks in Workflow Manager like Sessions, Events raise, Event wait, Decision, E-mail, Command, Worklets, Assignment, Timer and Scheduling of the workflow.
  • Used various transformations like Filter, Router, Expression, Lookup (connected and unconnected), Aggregator, Sequence Generator, Update Strategy, Joiner, Normalizer, Sorter and Union to develop robust mappings in the Informatica Designer.
  • Extensively used workflow variables, mapping parameters and mapping variables.
  • Created sessions, batches for incremental load into staging tables and scheduled them to run daily.
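The volatile-table, incremental-load and collect-statistics steps described above can be sketched in a single BTEQ script. This is an illustrative pattern only; every object name (stg.customers, edw.dim_customer, edw.load_control) is a hypothetical placeholder, not the project's actual schema:

```sql
.LOGON tdprod/etl_user,<password>;

/* Volatile table holding only the changed rows (the incremental delta) */
CREATE VOLATILE TABLE delta_customers AS (
    SELECT src.customer_id, src.cust_name, src.updated_ts
    FROM   stg.customers src
    WHERE  src.updated_ts > (SELECT MAX(load_ts)
                             FROM   edw.load_control
                             WHERE  table_name = 'DIM_CUSTOMER')
) WITH DATA ON COMMIT PRESERVE ROWS;

/* Teradata join-update: refresh rows that already exist in the target */
UPDATE tgt
FROM edw.dim_customer AS tgt, delta_customers AS d
SET cust_name = d.cust_name
WHERE tgt.customer_id = d.customer_id;

/* Insert rows not yet present in the dimension */
INSERT INTO edw.dim_customer (customer_id, cust_name)
SELECT d.customer_id, d.cust_name
FROM   delta_customers d
WHERE  NOT EXISTS (SELECT 1 FROM edw.dim_customer t
                   WHERE  t.customer_id = d.customer_id);

/* Refresh optimizer statistics after the load */
COLLECT STATISTICS ON edw.dim_customer COLUMN (customer_id);

.IF ERRORCODE <> 0 THEN .QUIT 8;   /* non-zero exit for Control-M alerting */
.LOGOFF;
```

The `.QUIT 8` return code is what a Control-M (or shell) wrapper checks to decide whether downstream jobs may run.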

Environment: Teradata V15, Teradata SQL Assistant, Informatica Power Center 9.1, BTEQ, MLOAD, FLOAD, FASTEXPORT, UNIX Shell Scripts, Control-M scheduling tool, Power-BI.

Confidential, CA

ETL/ Informatica Developer/ Production Support

Responsibilities:

  • Involved in client meetings on daily basis for understanding and analyzing the business requirements.
  • Coordinating with client and offshore to complete work on time.
  • Kept in regular touch with Informatica admins, Greenplum admins and the Autosys team.
  • Used Informatica data quality (IDQ) in cleaning and formatting customer master data.
  • Attending weekly meetings with UDW L3 team, EDH L3 team and with offshore.
  • Attending KT sessions for new deployments to prod
  • Working on Queries raised by Offshore and helping them to solve the issues to maintain smooth execution of batch process.
  • Involved in complete life cycle implementation of SAP BI / BO.
  • Working on creating new approaches for enhancing the regular process.
  • Attending ad hoc meetings.
  • Responding to ad hoc queries raised by upstream/downstream/application teams.
  • Helping offshore team in testing code changes and reviewing them.
  • Handling the logon issue with SAP BO and DS.
  • Deploying the changes tested to prod by raising CM.
  • Creating the users and granting access in SAP BODS and BO.
  • Preparing technical documents (design documents, HLDs, LLDs) from business requirement documents.
  • Prepare test strategies and execution plan. Monitoring progress of work.
  • Extensively developed Informatica mappings, mapplets and workflows.
  • Used Informatica Workflow Manager to develop and schedule sessions and send pre- and post-session emails to communicate success or failure of session execution.
  • Extensively used Greenplum SQL.
  • Maintained the existing GL application.

Environment: Informatica Power Center 9.6.1, Oracle 11g, Linux, Autosys, Toad, Greenplum, Wherescape Red, Powerexchange.

Confidential, CA

ETL/ Informatica Developer

Responsibilities:

  • Extensively used Informatica Client tools - Power Center Designer, Workflow Manager, Workflow Monitor and Repository Manager
  • Extracted data from Oracle, flat files, Excel files, relational sources, XML and web services, and used Joiner, Expression, Aggregator, Lookup, Stored Procedure, Filter, Router and Update Strategy transformations to load data into the target systems.
  • Used the staged-mapping concept to perform asynchronous web-service request and response as part of Informatica mappings.
  • Extracted data from various source systems into the landing zone by creating Informatica mappings using Teradata FastLoad connections.
  • Designed Mappings using B2B Data Transformation Studio.
  • Good experience with IDQ analyst tool and Developer tool.
  • Good experience in Glossary and Rule specifications.
  • Good experience creating IDQ transformations such as Standardizer, Labeler and Match/Merge.
  • Good experience in creating mapplets for Powercenter in IDQ tool.
  • Configured AddressDoctor content on both PowerCenter and IDQ servers and helped users build scenarios.
  • Extensively used the Informatica Cloud client tools: Designer, Workflow Manager, Workflow Monitor and Repository Manager.
  • Wrote BTEQ scripts to transform data
  • Wrote Fastexport scripts to export data
  • Wrote, tested and implemented Teradata FastLoad, MultiLoad and BTEQ scripts, DML and DDL.
  • Created Sessions, Tasks, Workflows and worklets using Workflow manager.
  • Worked with Data modeler in developing STAR Schemas
  • Experience in converting SAS scripts to Informatica.
  • Worked on utilities like FLOAD, MLOAD, FEXP of Teradata and created batch jobs using BTEQ.
  • Developed complex mapping using Informatica Power Center Cloud.
  • Extracted data from Oracle, XML, SAP and MS SQL Server (T-SQL) sources and implemented a delta mechanism using Expression, Aggregator, Lookup, Stored Procedure, Filter, Router and Update Strategy transformations to load data into the target systems.
  • Responsible for Configuration and administration of Meta data Manager.
  • Used TOAD, SQL Developer to develop and debug procedures and packages.
  • Involved in developing the Deployment groups for deploying the code between various environment (Dev, QA).
  • Extensively Used Change Data Capture (CDC) to simplify ETL in data warehouse applications.
  • Experienced with Informatica PowerExchange (8.x) for Loading/Retrieving data from mainframe systems.
  • Created pre-SQL and post-SQL scripts to be run at the Informatica session level.
  • Worked extensively with session parameters, Mapping Parameters, Mapping Variables and Parameter files for Incremental Loading.
  • Extracted data from various heterogeneous sources like Oracle, SQL Server, XML, XSD’s, Flat Files, Teradata.
  • Extensively worked with various Lookup caches like Static cache, Dynamic cache and Persistent cache.
  • Developed Slowly Changing Dimension Mappings for Type 1 SCD and Type 2 SCD
  • Monitored and improved query performance by creating views, indexes, hints and sub queries
  • Extensively involved in enhancing and managing Unix Shell Scripts.
  • Developed workflow dependency in Informatica using Event Wait Task, Command Wait.
  • Created mappings by cleansing the data and populate that into Staging tables, populating the staging to Archive and then to Enterprise Data Warehouse by transforming the data into business needs & Populating the Data Mart with only required information
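The Type 2 SCD mappings mentioned above implement logic that, expressed directly in SQL, looks roughly like the sketch below. The stg.products/edw.dim_product names and the eff_start_dt/eff_end_dt/current_flag columns are illustrative assumptions, not the project's actual schema, and the UPDATE must run before the INSERT:

```sql
/* Step 1: close out the current version of any changed row */
UPDATE edw.dim_product
SET    eff_end_dt   = CURRENT_DATE - 1,
       current_flag = 'N'
WHERE  current_flag = 'Y'
AND    product_id IN (SELECT s.product_id
                      FROM   stg.products s
                      JOIN   edw.dim_product d
                        ON   d.product_id   = s.product_id
                       AND   d.current_flag = 'Y'
                      WHERE  s.product_name <> d.product_name);

/* Step 2: insert the new version (and brand-new products) with an
   open-ended effective range; changed rows no longer have a 'Y' match
   after Step 1, so they fall through the LEFT JOIN and are inserted */
INSERT INTO edw.dim_product
       (product_id, product_name, eff_start_dt, eff_end_dt, current_flag)
SELECT s.product_id, s.product_name, CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM   stg.products s
LEFT JOIN edw.dim_product d
  ON   d.product_id = s.product_id AND d.current_flag = 'Y'
WHERE  d.product_id IS NULL;
```

In PowerCenter the same effect is achieved with a Lookup on the current dimension row feeding an Update Strategy that routes rows to DD_UPDATE or DD_INSERT.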

Environment: Informatica Power Center 10.1/9.6.1, Oracle 11g, Linux, Autosys, Toad, Greenplum

Confidential, CA

ETL/ Informatica Developer

Responsibilities:

  • Expertise in UNIX shell scripting, FTP, SFTP and file management in various UNIX environments.
  • Used Informatica Power Center 9.5 for Extraction, Transformation and Loading data from heterogeneous source systems into the target data base.
  • Experienced in the use of agile approaches, including Extreme Programming, Test-Driven Development and Scrum
  • Extensive experience with Healthcare, Energy and Media domains.
  • Experience with healthcare data in HIPAA ANSI X12 4010 and 5010 formats, including NDC, DRG, CPT, NCPDP, NSF codes, ICD-10, ICD-9 and the 997, 837, 834, 835, 277, 271 and 270 transactions.
  • Extensive knowledge of the inbound and outbound HIPAA transaction sets in both the 4010A and 5010A HIPAA versions.
  • Expertise in working with direct submitters, clearinghouses and providers regarding EDI setup, claim submissions, eligibility transactions and ERAs.
  • Researched the existing client processes and guided the Vendors in aligning the HIPAA compliance output with the client systems.
  • Worked on Claims database for claims processing.
  • Extensively worked with Teradata utilities like BTEQ, Fast Export, Fast Load, Multi Load to export and load data to/from different source systems including flat files.
  • Wrote Teradata Macros and used various Teradata analytic functions.
  • Involved in migration projects that moved data warehouses from Oracle/DB2 to Teradata.
  • Worked on Patient Ally applications for claim processing.
  • Testing web services API publication and subscription
  • Highly motivated and goal-oriented individual with a strong background in SDLC Project Management and Resource Planning using AGILE methodologies.
  • Created Informatica mappings using various transformations like SAP BAPI/RFC, SAP IDoc, Web Services Consumer, XML, HTTP, Source Qualifier, Expression, Lookup, Stored Procedure, Aggregator, Update Strategy, Joiner, Union, Filter and Router. In the EDW, consignment-related data is loaded based on business rules; this data is extracted from the SAP source system.
  • Extensively worked on Power Center Designer, Workflow Manager.
  • Involved in extracting data from ERPs like SAP, JD ENTERPRISE 1 into EDW (Oracle) and developed Transformation logic using Informatica.
  • Enabled historical capture of measures and metrics so that trending and analysis across multiple HEDIS cycles can be performed.
  • Created all legacy HEDIS reporting areas.
  • Generated annual HEDIS NCQA/medical record review processes for accreditation.
  • Worked closely with Accenture load team for Data load into SAP (Direct Load Method/ LSMW/ BAPI)
  • Developed ETL technical specs, ETL execution plan, Test cases, Test scripts etc.
  • Created ETL mappings and mapplets to extract data from ERPs like SAP, ENTERPRISE 1 (Oracle) and load into EDW (Oracle 10g)
  • Worked with the business to cleanse the data for Human Resources objects to help better cleanse data that can be loaded on SAP.
  • Created the ETL Exception reports, SAP Exception reports, and validation reports after the data is loaded into SAP
  • Once the application is gone live, created the Reconciliation reports to make sure that the necessary data is loaded into SAP without fail.
  • Experience working with Business users and architects to understand the requirements and to pace up the process in meeting the milestone.
  • Performance tuned and optimized various complex SQL queries.
  • Created Pre/Post Session and SQL commands in sessions and mappings on the target instance.
  • Responsible for Performance tuning at various levels during the development.
  • Identified performance issues in existing sources, targets and mappings by analyzing the data flow, evaluating transformations and tuned accordingly for better performance.
  • Used stored procedures to create a standard Time dimension, drop and create indexes before and after loading data into the targets for better performance.
  • Developed shell scripts and PL/SQL procedures to create and drop tables and indexes for pre- and post-session performance management.
  • Responsible for migrating the workflows from development to production environment.
  • Used parameter files, mapping variables and mapping parameters for incremental loading.
  • Managed post-production issues and delivered all assignments/projects within specified timelines.
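The drop-and-recreate-index pattern behind the stored-procedure and pre-/post-session tuning work described above can be sketched as follows; idx_fact_sales_dt and fact_sales are hypothetical names used only for illustration:

```sql
/* Pre-session SQL: drop the index so the bulk load is not slowed
   by per-row index maintenance */
DROP INDEX idx_fact_sales_dt;

/* ... the Informatica session bulk-loads fact_sales here ... */

/* Post-session SQL: rebuild the index (and, in Oracle, optionally
   refresh statistics) so query performance is restored */
CREATE INDEX idx_fact_sales_dt ON fact_sales (sale_dt);
```

Attaching these as session pre-SQL/post-SQL (or pre/post stored-procedure calls) keeps the load fast while leaving the warehouse fully indexed for reporting.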

Environment: Informatica Power Center 9.6.1/9.5/8.6.1, Oracle 12c/11g, SQL Server 2008, Linux, LSF (job scheduling), UNIX shell scripting, SAS Enterprise Guide, MS Visual Studio (TFS) for version control, IDQ, DIH, Toad, RDBMS.

Confidential, Richmond, VA

Informatica Developer

Responsibilities:

  • Gathered requirements from the end users and Involved in analysis of source systems, business requirements and identification of business rules.
  • Involved in the Data Warehouse Data modeling based on the client requirement.
  • Responsible for developing, support and maintenance for the ETL (Extract, Transform and Load) processes using Informatica Power Center.
  • Extracted data from flat files, Oracle and SQL Server and loaded it into Oracle.
  • Responsible for creating system design and detail design documents based on the requirement document provided by the business users.
  • Provided strategic thinking and leadership on new ways of leveraging information to improve business processes.
  • Experienced in database design, data analysis, development, SQL performance tuning, data warehousing ETL process and data conversions.
  • Based on the logic, developed various mappings and mapplets to load data from various sources using different transformations like Source Qualifier, Expression, Filter, Router, Update Strategy, Sorter, Lookup, Aggregator and Joiner. Also developed error processing to capture error records and load them into a Message Log table.
  • Worked with Stored Procedure Transformation for time zone conversions.
  • Created UNIX scripts to automate activities like starting, stopping and aborting Informatica workflows using the pmcmd command.
  • Provided production support including error handling and validation of mappings.
  • Addressed and tracked requests for system enhancements and improvements from end users/customers and resolved production issues.
  • Extensively used the Debugger to apply breakpoints and modify data while a session is running.
  • Used various Informatica Error handling techniques to debug failed session.
  • Responsible for migrating folders, mappings and sessions from development to the test environment; created migration documents to move the code from one environment to another.

Environment: Informatica 9.1, UNIX, Windows, Oracle 10g, PL/SQL, MSSQL 2008R2, SQL, Flat Files and CSV, Shell Scripting, Putty, PL/SQL Developer

Confidential

Application ETL Developer

Responsibilities:

  • Responsible for design and development of Sales Data Warehouse.
  • Worked with Business Analyst and application users to finalize Data Model, functional and detailed technical requirements.
  • Extracted data from heterogeneous sources like Oracle, SQL Server
  • Created detailed Technical specifications for Data Warehouse and ETL processes.
  • Conducted a series of discussions with team members to convert Business rules into Informatica mappings.
  • Used Transformations like Look up, Router, Filter, Joiner, Stored Procedure, Source Qualifier, Aggregator and Update Strategy extensively. Tuned performance of Informatica session for large data files by increasing block size, data cache size, sequence buffer length and target based commit interval. Involved in doing error handling, debugging and troubleshooting Sessions using the Session logs, Debugger and Workflow Monitor.
  • Monitored data quality, generated weekly/monthly/yearly statistics reports on production processes (success/failure rates) for causal analysis as part of maintenance, and enhanced existing production ETL scripts.
  • Worked with SAP and Oracle sources to process the data.
  • Worked on SAP data migration for Human Resources and Finance, converting objects covering Organizational Structure, Addresses, Time, Basic Pay, Bank Details, Recurring Payments, Tax Assignment, Insurance Plans, Payroll etc., and generating reports from the SAP BI system.
  • Worked with pre- and post-session SQL commands to drop and recreate the indexes on the data warehouse using the Source Qualifier transformation of Informatica PowerCenter.
  • Created Unix Shell Scripts to automate sessions and cleansing the source data.
  • Implemented pipeline partitioning concepts like Round-Robin, Key-Range and Pass Through techniques in mapping transformations.
  • Involved in Debugging and Performance tuning of targets, sources, mappings and sessions.
  • Worked on data-masking activities for cloning the GDW for several buyers.

Environment: Informatica Power Center 8.6, Oracle 9i, Teradata V2R6, SAP, SAP BI 7.0, SQL Server, Sun Solaris.

Confidential

ETL Developer

Responsibilities:

  • Interacted with Business Analysts for Requirement gathering, understanding the Requirements, Explanation of technical probabilities and Application flow.
  • Developed ETL mappings, transformations using Informatica Power Center 8.6.1
  • Extracted data from flat files, DB2 and loaded the data into Oracle staging using Informatica Power Center.
  • Designed and created complex source-to-target mappings using various transformations, including Sorter, Aggregator, Joiner, Filter, Source Qualifier, Expression and Router.
  • Extensively used Lookup Transformation and Update Strategy Transformation while working with Slowly Changing Dimensions (SCD) and performed reading and loading high-volume Type 2 dimensions.
  • Created various transformations such as Union, Aggregator, Lookup, Update Strategy, Joiner, Filter and Router Transformations.
  • Involved in Performance tuning. Identified and eliminated bottlenecks (source, target, and mapping).
  • Designed various tasks using Informatica workflow manager like session, command, email, event raise, event wait and so on.
  • Designed and developed UNIX shell scripts as part of the ETL process to automate the data load processes to target.
  • Developed mappings that extract data from the ODS to the data mart, and monitored the daily, weekly and monthly loads.
  • Created and monitored Sessions/Batches using Informatica Server Manager/Workflow Monitor to load data into target Oracle database

Environment: Informatica PowerCenter 8.1, SQL Server 2005, DB2, PL/SQL, Toad, Business Objects 6.0, Windows NT, UNIX Shell Scripts.
