
Senior Teradata/ETL Informatica Power Center Developer Resume


Grand Rapids, MI

SUMMARY:

  • 8+ years of experience in Teradata 15/14/13/12, SQL, Informatica, data warehouse modeling, Teradata utilities, aggregates and building efficient views.
  • Hands on experience in Teradata SQL, BTEQ Scripting and Informatica Power Center components - Designer, Workflow Manager, Workflow Monitor and Repository Manager.
  • Expertise in Oracle … SQL, PL/SQL, Oracle Forms 11g/10g/9i/6i, Reports 10g/9i/6i, and Linux/UNIX shell scripting.
  • Good Exposure to Teradata SQL and BTEQ Scripting and Informatica in Telecom and Health care domains.
  • Part of an agile development team that supported health insurance applications for the subsidiary Priority Health.
  • Created UNIX shell scripts to automate monthly grouper runs for claims processing, using scheduling tools such as Master Job Scheduler.
  • Developed framework scripts in UNIX shell and Python for use by all teams.
  • Supported multiple teams in resolving framework setup and configuration issues.
  • Proficient in Database Sizing, Capacity Planning, Database Performance Monitoring, Database Backup/Recovery and SQL Query Tuning.
  • Good exposure to data warehousing applications; directly responsible for the Extraction, Transformation and Loading (ETL) of data in multiple formats into the data warehouse.
  • Worked on Hive HQL scripts and created external tables in Hive.
  • Experienced in Oracle … systems, SQL*Loader, Export, Import and SQL*Plus.
  • Wrote complex SQL queries and PL/SQL procedures to extract data from various source tables.
  • Experience in writing Stored Procedures, Functions, Database Triggers and Packages using PL/SQL.
  • Enforced data integrity using integrity constraints and database triggers.
  • Proficient in Oracle database design with relational models; very good experience in entity and attribute identification and in developing entity-relationship models.
  • Strong proficiency in maintaining data warehousing applications using Informatica Power Center.
  • Extensively worked on BTEQ; good knowledge of utilities including MultiLoad, FastExport, FastLoad, TPump and Teradata Parallel Transporter.
  • Used UNIX shell scripts for automating tasks for BTEQ and other utilities.
  • Familiar with creating secondary indexes and join indexes.
  • Experienced in handling Relational Data Modeling and Dimensional Data Modeling.
  • Secondary Skill set includes Informatica/Podium.
  • Hands-on experience in developing scalable enterprise applications tailored to user needs in the Finance, Telecom, Insurance and Retail domains.
  • Experienced in gathering system design requirements and in designing and writing system specifications.
  • Good team player with excellent technical and interpersonal skills.
  • Pro-active, self-motivated and able to work independently as well as in a team.
  • Excellent Documentation and Process Management skills with an ability to effectively understand the business requirements to develop a quality product.
  • Involved in code review process for designing standards to be used by other teams.
  • Strong problem-solving, analytical, interpersonal, and written and oral communication skills.
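Several bullets above mention automating BTEQ runs from UNIX shell. As a minimal illustrative sketch (the server name, logon, and table names below are placeholders, not from the original projects), a wrapper can generate a BTEQ control script and, in production, feed it to the bteq client:

```shell
#!/bin/sh
# Hypothetical sketch: build a BTEQ control script for a daily load.
# tdprod, etl_user and the database/table names are illustrative placeholders.

BTEQ_SCRIPT=${TMPDIR:-/tmp}/daily_load.bteq

cat > "$BTEQ_SCRIPT" <<'EOF'
.LOGON tdprod/etl_user,etl_password;

INSERT INTO stage_db.claims_stg
SELECT * FROM landing_db.claims_raw
WHERE load_dt = CURRENT_DATE;

.IF ERRORCODE <> 0 THEN .QUIT 8;

.LOGOFF;
.QUIT 0;
EOF

# In production this would be:  bteq < "$BTEQ_SCRIPT" > daily_load.log 2>&1
echo "generated $BTEQ_SCRIPT"
```

The `.IF ERRORCODE <> 0 THEN .QUIT 8` line is what lets the calling shell script (and the scheduler behind it) distinguish a failed load from a clean one.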

TECHNICAL SKILLS:

Primary Tools: Teradata SQL, Teradata Tools & Utilities, Informatica Power Center

Utilities: Teradata SQL Assistant, BTEQ, FastLoad, MultiLoad, FastExport, TPump, Teradata Parallel Transporter

Languages: Teradata SQL, C, Oracle PL/SQL

Databases: Teradata V2R6.x/12.0/13.0/14.0, Oracle

Operating Systems: Windows 95/98/NT/2000/XP, UNIX, Linux

Data Modeling: Erwin 4.0/3.5, Logical/Physical/Dimensional/3NF, Star Schema, ETL, OLAP

Scripting Languages: UNIX Shell Scripting, BTEQ

PROFESSIONAL EXPERIENCE:

Confidential, Grand Rapids MI

Senior Teradata/ETL Informatica Power Center Developer

Responsibilities:

  • Subject area: insurance claims processing, grouper automation and integration of data into the data warehouse.
  • Designed ETL processes for optimal performance and developed Informatica mappings to extract data from different systems and load it into the Oracle database.
  • Extensively used Teradata utilities such as BTEQ, MultiLoad and FastLoad scripts to load high volumes of data.
  • Responsible for analyzing, designing and helping team to validate the developed ETL code.
  • Created UNIX shell scripts to automate jobs in Rundeck and to invoke the monthly grouper run for claims processing.
  • Wrote Windows shell scripts to purge aged files from the server and reclaim disk space.
  • Created complex Informatica mappings and reusable transformations, and prepared various mappings to load data into different stages (landing, staging and target); worked with cross-functional teams to resolve issues.
  • Create various user defined exceptions to control the process based on the business requirement.
  • Involved in generating numbers for primary key values using Oracle sequence objects.
  • Created various tasks such as Session and Command; modified several existing mappings based on user requirements and maintained existing mappings, sessions and workflows.
  • Tuned mapping performance by following Informatica best practices, applying several methods to reduce workflow run times.
  • Participated in all Scrum ceremonies, including PI planning, sprint execution, backlog grooming, retrospectives, and innovation and demonstration sessions.
  • Contributed to architecture planning and mentored new team members.
  • Responsible for the development of ETL and reporting solutions to address business needs.
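The purge work described above comes down to a small retention script. This is a hedged sketch of the UNIX equivalent (the 30-day window, directory layout and file names are assumptions for illustration; the original also used Windows shell):

```shell
#!/bin/sh
# Hypothetical retention sketch: delete files older than a cutoff to
# reclaim server space. Retention window and file names are illustrative.

purge_old_files() {
    # $1 = directory, $2 = retention in days
    # -mtime +N matches files modified more than N*24h ago;
    # -print logs each file before -delete removes it.
    find "$1" -type f -mtime +"$2" -print -delete
}

# Demo against a scratch directory with a back-dated file (GNU touch -d)
DEMO_DIR=$(mktemp -d)
touch "$DEMO_DIR/keep_me.dat"                 # fresh: survives the purge
touch -d '40 days ago' "$DEMO_DIR/stale.dat"  # older than 30 days: purged
purge_old_files "$DEMO_DIR" 30
```

Logging the deleted names with `-print` gives the scheduler's job log an audit trail of what each monthly run removed.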

Environment: Teradata 15, Teradata SQL Assistant, Teradata utilities, MultiLoad, FastLoad, FastExport, Teradata Parallel Transporter (TPT), BTEQ, Oracle Database 12c/Oracle 11i, Informatica Power Center 9.6/9.1, Windows, UNIX, Linux, Master Job Scheduler, Bitbucket, Rundeck, Git, XI R2, Collibra.

Confidential, Southfield MI

Sr ETL Developer

Responsibilities:

  • Developed ETL code to extract, transform, cleanse and load data from source to target data structures.
  • Tuned queries by rewriting them to improve performance.
  • Wrote stored procedures, functions, database triggers and packages using PL/SQL.
  • Used external tables to load data from flat files into Oracle tables.
  • Worked on collections, triggers, functions, packages, views, materialized views and other Oracle concepts.
  • Handled errors using Exception Handling extensively for the ease of debugging and displaying the error messages in the application.
  • Created mappings using Expression, Lookup, Sorter, Normalizer, Aggregator and Router transformations to populate target tables efficiently.
  • Created Informatica Power Center Mappings, Mapplets and Transformations using the Designer and developed Informatica sessions as per the business requirement.
  • Used Informatica to extract, transform and load data from different input sources like flat files and XML files and DB.
  • Developed workflows using task manager, workflow designer in workflow manager and monitored the results using workflow monitor.
  • Performed unit testing of the Informatica ETL process to validate ETL logic.
  • Used Tivoli Workload Scheduler to schedule jobs.
  • Involved in performance tuning of SQL Queries, Sources, Targets and sessions.
  • Developed the UNIX shell scripts for the PL/SQL procedures job run.
  • Coordination with business users for user acceptance testing, signoffs and implementation.
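The shell scripts that launch PL/SQL procedure runs under a scheduler typically follow one pattern: run the job, timestamp a log line, and propagate the exit code so Tivoli or Control-M can alert on failure. A generic sketch, with `true`/`false` standing in for the real sqlplus invocation (which is an assumption, not shown in the original):

```shell
#!/bin/sh
# Hypothetical job-wrapper sketch: run a command, log the outcome with a
# timestamp, and return its exit code so the scheduler can detect failure.
# A real wrapper would invoke sqlplus with the PL/SQL procedure call here.

run_job() {
    job_name=$1; shift
    if "$@"; then
        echo "$(date '+%Y-%m-%d %H:%M:%S') $job_name OK"
        return 0
    else
        rc=$?
        echo "$(date '+%Y-%m-%d %H:%M:%S') $job_name FAILED rc=$rc" >&2
        return $rc
    fi
}

run_job nightly_refresh true  && GOOD=yes   # stand-in for a succeeding job
run_job broken_step     false || BAD=yes    # stand-in for a failing job
```

Returning the job's own exit code (rather than swallowing it) is what lets the scheduler rerun or page on a failed procedure.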

Environment: Oracle 12c, Informatica Power Center Designer 9.x, Linux, Control-M, TFS, Tivoli, shell scripts.

Confidential, Charlotte NC

Senior Teradata/Hive Developer

Responsibilities:

  • Designed, developed, tested and supported Extract, Transform and Load (ETL) processes necessary to load and validate data warehouse using UNIX shell scripting.
  • Worked on Big Data tools such as Podium, Hive, MapReduce, Python and UNIX shell scripting.
  • Responsible for Extraction of massive structured and unstructured data from various source systems.
  • Involved in Transformation of data into Business relevant Key Performance Indicators (KPIs) and Loading Normalized and Aggregated data into Relational Databases such as Teradata, Netezza and Oracle for Consumer Applications and Business Intelligence reports.
  • Ensured high level of coding standards by following complete Software Development Life Cycle (SDLC) models such as Waterfall and Agile Methodologies.
  • Strong hands-on experience with Teradata tools and utilities such as BTEQ, FastLoad, MultiLoad, TPump, FastExport and TPT.
  • Performance-tuned long-running BTEQ queries in production using EXPLAIN plans.
  • Worked on scheduling tools like One Automation and Podium.
  • Good knowledge of Hadoop ecosystem components such as HDFS, MapReduce and Hive.
  • Used Teradata SQL Assistant to query source/target tables and validate the BTEQ scripts.
  • Troubleshoot the issues by checking sessions and workflow logs.
  • Wrote Unix Shell scripts to automate workflows.
  • Performed unit testing, system testing and post-production verification.
  • Involved in writing the test cases and documentation.
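The Hive external-table work above follows a standard pattern: point an EXTERNAL table at flat files already sitting in HDFS. This sketch emits illustrative HQL (database, columns, delimiter and HDFS path are all assumptions) that would normally be executed with `hive -f` or beeline:

```shell
#!/bin/sh
# Hypothetical sketch: generate HQL for a Hive external table over
# pipe-delimited flat files in HDFS. All names and paths are placeholders.

HQL_FILE=$(mktemp)

cat > "$HQL_FILE" <<'EOF'
CREATE EXTERNAL TABLE IF NOT EXISTS staging.claims_ext (
    claim_id    STRING,
    member_id   STRING,
    service_dt  STRING,
    paid_amt    DECIMAL(12,2)
)
ROW FORMAT DELIMITED
FIELDS TERMINATED BY '|'
STORED AS TEXTFILE
LOCATION '/data/landing/claims';
EOF

# In practice: hive -f "$HQL_FILE"
echo "wrote $HQL_FILE"
```

Because the table is EXTERNAL with an explicit LOCATION, Hive reads the files in place and dropping the table leaves the underlying HDFS data intact.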

Environment: Teradata 15/14.10, Teradata SQL Assistant, Teradata utilities, TASM, MultiLoad, FastLoad, TPump, FastExport, Teradata Parallel Transporter (TPT), BTEQ, Oracle Database 12c/Oracle 11i, Informatica Power Center 9.6/9.1, Windows, UNIX, One Automation, Podium, Linux, Korn shell.

Confidential, Trevose PA

Teradata Developer

Responsibilities:

  • Communicated with business users and analysts about business requirements and analyzed the specifications provided by the clients.
  • Reviewed mapping documents provided by Business Team, implemented business logic embedded in mapping documents into Teradata SQLs and loaded tables needed for Data Validation.
  • Interacted with people from various teams in the project, such as Oracle/ETL/Teradata DBAs, Control-M schedulers and MicroStrategy (BI) reporting, to aid the smooth functioning of the project flow.
  • Worked on complex queries to map the data as per the requirements.
  • Coded in Teradata BTEQ SQL and wrote UNIX scripts to validate, format and execute the SQLs in the UNIX environment.
  • Responsible for back-end stored procedure development using predefined PL/SQL procedures.
  • Used Explain Plan and hints to tune the SQL.
  • Used UNIX Shell scripts to deploy the Oracle forms and reports to production servers.
  • Involved in loading the data from flat files to Oracle tables using SQL*Loader and C.
  • Involved in creating user documentation and providing End user training.
  • Involved in the code reviews conducted by the teams.
  • Worked on deploying solutions using version control systems like Subversion.
  • Created UNIX scripts to trigger stored procedures and macros.
  • Created test cases and performed unit testing for the SQL views and documented the Unit testing results and reviewed the Integration test cases.
  • Populated and refreshed Teradata tables using FastLoad, MultiLoad and FastExport utilities for User Acceptance Testing.
  • Production Implementation and Post Production Support.
  • Worked with Scheduling team on finding the feasible time to run the jobs.
  • Created views based on user and/or reporting requirements.
  • Performance-tuned long-running queries.
  • Reduced Teradata space usage by optimizing tables: adding compression where appropriate and ensuring optimum column definitions.
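Space reduction of the kind described above usually comes down to multi-value compression on low-cardinality columns plus right-sized types. A hedged sketch of such DDL (the table, columns and compressed value lists are illustrative, not from the original system):

```shell
#!/bin/sh
# Hypothetical sketch: Teradata DDL applying multi-value compression.
# Values listed in COMPRESS are stored once per column in the table header
# rather than once per row, shrinking perm space for large tables.

DDL_FILE=$(mktemp)

cat > "$DDL_FILE" <<'EOF'
CREATE TABLE claims_db.claim_status_hist (
    claim_id   DECIMAL(18,0) NOT NULL,
    status_cd  CHAR(2)      COMPRESS ('AP','DN','PN','RV'),
    source_sys VARCHAR(10)  COMPRESS ('FACETS','LEGACY'),
    status_dt  DATE FORMAT 'YYYY-MM-DD'
)
PRIMARY INDEX (claim_id);
EOF

echo "wrote $DDL_FILE"
```

Compression pays off only on columns where a few values dominate, so candidate columns are normally found by profiling value frequencies first.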

Environment: Teradata V14, BTEQ, Teradata SQL Assistant, MultiLoad, FastLoad, FastExport, SQL, UNIX, Windows XP, Linux, Control-M scheduling tool.

Confidential

Teradata Developer / ETL Informatica Power Center Developer

Responsibilities:

  • Created the Design for Extraction process from legacy systems using combined techniques of Data Replication and Change Data Capture.
  • Completed the Gap Analysis which includes identifying the gaps between the downstream partner requests to the source system files and to fill the gaps either by rejecting the downstream partner's requests or requesting additional files from the source system.
  • Extensively used Fast export to export data out of Teradata tables.
  • Created BTEQ scripts to invoke various load utilities, transform the data and query against Teradata database.
  • Extensively used SQL Analyzer and wrote complex SQL Queries using joins, sub queries and correlated sub queries.
  • Worked extensively with Teradata utilities (FastLoad, MultiLoad, TPump, Teradata Parallel Transporter (TPT)) to load huge amounts of data from flat files into the Teradata database.
  • Created proper Primary Indexes (PIs), taking into consideration both planned data access and even distribution of data across all available AMPs.
  • Expertise in creating databases, users, tables, triggers, macros, views, stored procedures, functions, packages, joins and hash indexes in the Teradata database.
  • Hands-on experience with query tools such as TOAD, SQL, PL/SQL, Teradata SQL Assistant and Queryman.
  • Involved in performance analysis, monitoring and SQL query tuning using EXPLAIN PLAN, Collect Statistics and SQL Trace both in Teradata as well as Oracle.
  • Excellent experience with different index types (PI, SI, JI, AJI, PPI (MLPPI, SLPPI)) and Collect Statistics.
  • Wrote Teradata Macros and used various Teradata analytic functions.
  • Created highly optimized ETL processes to move the data from legacy systems, DB2 and flat files into Oracle database.
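The FastLoad loads described above are driven by a control script fed to the fastload client. A hedged sketch of the shape such a script takes (host, logon, file path and column list are placeholders for illustration):

```shell
#!/bin/sh
# Hypothetical sketch: generate a FastLoad control script for bulk-loading a
# pipe-delimited flat file into an empty staging table. FastLoad requires an
# empty target table and uses two error tables for rejected rows.

FL_FILE=$(mktemp)

cat > "$FL_FILE" <<'EOF'
LOGON tdprod/etl_user,etl_password;

BEGIN LOADING stage_db.claims_stg
    ERRORFILES stage_db.claims_err1, stage_db.claims_err2;

SET RECORD VARTEXT "|";

DEFINE
    claim_id  (VARCHAR(18)),
    member_id (VARCHAR(18)),
    paid_amt  (VARCHAR(14))
FILE = /data/landing/claims.dat;

INSERT INTO stage_db.claims_stg
VALUES (:claim_id, :member_id, :paid_amt);

END LOADING;
LOGOFF;
EOF

# In production: fastload < "$FL_FILE" > claims_fload.log 2>&1
echo "wrote $FL_FILE"
```

Rows that fail conversion or violate constraints land in the two ERRORFILES tables, which is where load validation starts after each run.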

Environment: Informatica PowerCenter 9.1, Teradata 12, Oracle 11i, DB2 9.1, FastLoad, Ingest, MultiLoad, TPump, FastExport, Teradata Parallel Transporter (TPT), BTEQ, Teradata SQL Assistant, Web services, Business Objects XI R2, Linux, SQL, PL/SQL, XML, SQL*Loader, Tivoli scheduler, Control-M, UC4, Toad 9.5, Korn shell, Erwin.

Confidential

Senior ETL/Teradata Developer

Responsibilities:

  • Conducted source System Analysis and developed ETL design document to meet business requirements.
  • Developed Informatica Mappings and Workflows to extract data from PeopleSoft, Oracle, CSV files to load into Teradata staging area using Fast Load/Tpump utilities.
  • Developed ETLs to load data from source to 3NF, stage to 3NF, stage area to work, and work to 3NF, using the Informatica Pushdown Optimization technique to utilize database processing power.
  • Designed and developed custom Data Quality audits to identify and report the data mismatch between source and target systems and alert Operations Team.
  • Tuned Teradata SQL queries and resolved performance issues due to Data Skew and Spool space issues.
  • Exported/imported 11g databases using Data Pump export/import.
  • Developed, produced and maintained structural design of databases based upon logical data models and business requirements.
  • Coordinated and aligned with application team and ETL team.
  • Created, edited and deleted schema objects (tables, indexes, sequences, views, materialized views) using advanced PL/SQL concepts.
  • Extensively used Oracle Enterprise Manager to monitor multiple instances, SQL*Loader to load data into databases, and export/import utilities for data transfer between schemas/databases.
  • Developed Flat files from Teradata using fast export, BTEQ to disseminate to downstream dependent systems.
  • Supported System Integration and User acceptance tests to obtain sign off.
  • Post go live Production Support and Knowledge Transfer to Production Support team.

Environment: Teradata V2R6/V2R5/12, Oracle 9i/10g, Informatica PowerCenter 8/8.6, Linux, Business Objects, UNIX, Teradata SQL Assistant, ARCMAIN, MLOAD, BTEQ, Teradata Manager, Mainframes DB2, Erwin Designer, Windows 2000, Control-M, ClearCase, shell scripts
