Informatica Developer Resume

Dallas, Texas

SUMMARY:

  • 10+ years of IT experience in the programming, design, and development of data warehousing solutions using Teradata, Informatica, and UNIX environments for medium to large enterprise data warehouses.
  • 2+ years of experience in all phases of Hadoop technologies, with an excellent understanding and in-depth knowledge of Hadoop architecture and its components, such as HDFS, MapReduce programming, and other ecosystem components.
  • Experienced with major Hadoop ecosystem projects such as Pig, Hive, Sqoop, and HBase.
  • Good understanding of Spark architecture and its components.
  • Good understanding of the various phases of the project life cycle.
  • Good knowledge on Data Warehousing concepts.
  • Experience in using the Teradata load utilities BTEQ, FastLoad, MultiLoad, and TPT.
  • Good knowledge on Teradata Architecture.
  • Good understanding of Teradata performance tuning measures, including collecting statistics, Primary Indexes, Partitioned Primary Indexes, Join Indexes, Secondary Indexes, and analyzing EXPLAIN plans (see the sketch after this list).
  • Strong DDL and DML writing skills, as well as the ability to write complex SQL for data analysis.
  • Good experience in query optimization and performance enhancement, and in techniques for improving load performance across various schemas and databases.
  • Instrumental in analyzing long-running queries, identifying root causes, and saving system resources effectively.
  • In-depth expertise in the Teradata cost-based query optimizer; identified potential bottlenecks in queries from the aspects of query writing, skewed redistributions, join order, optimizer statistics, and physical design considerations (PI, USI, NUSI, JI, etc.).
  • Thorough knowledge of working with DBQL tables and suggesting solutions with better-optimized queries.
  • Experience with Teradata Viewpoint for monitoring sessions and identifying high CPU utilization, skewed queries, and high spool usage.
  • Good experience in the ETL tool Informatica and in UNIX scripting.
  • Good experience with the Tivoli scheduling tool.
  • Excellent problem-solving skills with a strong technical background and good interpersonal skills.
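
As a minimal sketch of the statistics-and-EXPLAIN tuning workflow summarized above, driven from a UNIX shell through BTEQ; the logon values, database, tables, and columns are placeholders rather than any actual system's objects.

    #!/bin/sh
    # Refresh optimizer statistics, then review the plan of a slow query;
    # in the EXPLAIN output, look for product joins and large redistributions.
    bteq <<'EOF'
    .logon tdprod/etl_user,password

    COLLECT STATISTICS COLUMN (cust_id) ON sales_db.daily_sales;
    COLLECT STATISTICS COLUMN (sale_dt) ON sales_db.daily_sales;

    EXPLAIN
    SELECT c.cust_name, SUM(s.sale_amt)
    FROM   sales_db.daily_sales s
    JOIN   sales_db.customer    c ON c.cust_id = s.cust_id
    GROUP  BY 1;

    .logoff
    .quit
    EOF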

TECHNICAL SKILLS:

Programming Language: Java basics

Hadoop Ecosystem: Hadoop, MapReduce, HDFS, HBase, Hive, Pig; knowledge of Spark

Operating Systems: SUSE Linux Enterprise 9/10, Windows XP, UNIX (MP-RAS).

Version Control Tools: CVS, SVN

Linux Tools: Shell Scripting, Vi Editor.

ETL Tools: Informatica, Teradata load utilities

Databases: Teradata 14.0, 13.10, 13.0, 12.0; Oracle

Load/Unload Utilities: BTEQ, FastLoad, MultiLoad, TPump, FastExport, TPT

System Monitoring/Analysis: Teradata Viewpoint, query analysis tools, Teradata System Emulation Tool (TSET)

System Management Tools: DBSControl

PROJECT PROFILE:

Confidential, Dallas, Texas

Informatica Developer

Operating Systems: Unix

Programming Languages: Teradata, Informatica.

Responsibilities:

  • Worked extensively on BTEQ and UNIX shell scripting.
  • Resolved issues related to semantic layer or reporting layer.
  • Worked on different subject areas like Campaign, Digital Marketing, Promotion and Item.
  • Wrote complex SQL using joins, subqueries, and correlated subqueries; expert in SQL queries for cross-verification of data.
  • Developed Teradata macros and stored procedures to load data into work tables and then move it from the work tables into the base tables (see the sketch after this list).
  • Reviewed SQL for missing joins and join constraints, data format issues, mismatched aliases, and casting errors.
  • Responsible for design, data mapping analysis, and mapping rules.
  • Responsible for Development, Coding & testing.
  • Responsible for Implementation & Post Implementation support.
  • Extensively used loader utilities to load flat files into Teradata RDBMS.
  • Used the BTEQ and SQL Assistant (Queryman) front-end tools to issue SQL commands matching the business requirements to the Teradata RDBMS.
  • Created TPT scripts to transfer data from SQL Server to Teradata (see the TPT sketch after this list).
  • Collected statistics periodically on tables to improve system performance.
  • Performed tuning and optimization of application SQL using Query analyzing tools.
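
A minimal sketch of the work-table-to-base-table move described above, run from the shell through BTEQ; the logon values, databases, and table names are placeholders, not the project's actual objects.

    #!/bin/sh
    # Hypothetical BTEQ step: merge staged rows from a work table into the
    # base table, then clear the work table for the next load cycle.
    bteq <<'EOF' > load_item_base.log 2>&1
    .logon tdprod/etl_user,password

    INSERT INTO edw_db.item
    SELECT w.item_id, w.item_desc, w.load_dt
    FROM   stg_db.item_wrk w
    WHERE  NOT EXISTS (SELECT 1 FROM edw_db.item b
                       WHERE b.item_id = w.item_id);

    DELETE FROM stg_db.item_wrk;

    .logoff
    .quit
    EOF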
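
The SQL Server-to-Teradata transfer could be launched from the shell roughly as below; the TPT job script, the variable file, and their contents are hypothetical stand-ins, with the job assumed to pair an ODBC producer operator against the Teradata Load operator.

    #!/bin/sh
    # Run a (hypothetical) TPT job: sqlserver_to_td.tpt reads SQL Server
    # through the ODBC operator and loads Teradata with the Load operator.
    # job_vars.txt would carry placeholder values such as:
    #   TargetTdpId    = 'tdprod'
    #   TargetUserName = 'etl_user'
    #   SourceDSN      = 'sqlserver_dsn'
    tbuild -f sqlserver_to_td.tpt \
           -v job_vars.txt \
           -j item_xfer_$(date +%Y%m%d)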

Confidential, Texas

Informatica Developer

Operating Systems: Unix

Programming Languages: Hive, Pig, Unix, Teradata, Informatica.

Responsibilities:

  • Involved in data ingestion into HDFS using Sqoop from a variety of sources such as Oracle, using JDBC connectors and import parameters (see the sketch after this list).
  • Designed and developed the scripts to load the data into Hive.
  • Performed various performance optimizations, such as using the distributed cache for small datasets, and partitioning and bucketing in Hive.
  • Implemented partitioning, dynamic partitions, and buckets in Hive, and analyzed the partitioned and bucketed data to compute various metrics for reporting (a table-definition sketch follows this list).
  • Involved in creating Hive tables, loading them with data, and writing Hive queries that run internally as MapReduce jobs. Used HCatalog to access Hive table metadata from MapReduce.
  • Responsible for migrating tables from traditional RDBMS into Hive tables using Sqoop.

Confidential

Informatica Developer

Programming Languages: Hive, Sqoop, Teradata, Pig

Responsibilities:

  • Analyzed requirements, estimated the level of effort, provided timelines to the business with weekly updates, met those timelines while delivering quality output, and fixed production issues.
  • Migrated the existing data to Hadoop from RDBMS using Sqoop for processing the data.
  • Used the Hive data warehouse tool to analyze the data in HDFS and developed Hive queries.
  • Developed simple to complex MapReduce jobs using Hive.
  • Created partitioned tables in Hive (see the dynamic-partition sketch after this list).
  • Responsible for creating Hive tables, loading the structured data resulted from MapReduce jobs into the tables and writing Hive Queries to further analyze the data.
  • Worked on setting up Pig and Hive on multiple nodes, and developed jobs using Pig, Hive, and MapReduce.
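
Loads into such partitioned tables often use Hive's dynamic partitioning; a small sketch under assumed table, staging-table, and column names.

    #!/bin/sh
    # Hypothetical dynamic-partition load: Hive routes each row to its
    # sale_dt partition based on the last column of the SELECT.
    hive -e "
    SET hive.exec.dynamic.partition = true;
    SET hive.exec.dynamic.partition.mode = nonstrict;

    INSERT OVERWRITE TABLE sales.daily_sales PARTITION (sale_dt)
    SELECT sale_id, cust_id, sale_amt, sale_dt
    FROM   sales.daily_sales_stg;
    "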

Confidential

Informatica Developer

Operating Systems: Unix

Programming Languages: Teradata, Hive, Pig, Sqoop.

Responsibilities:

  • Understood the requirement document.
  • Communicated with the client or business owner to ensure that requirements were clearly understood.
  • Loaded data into the Hadoop ecosystem from external systems.
  • Imported data into HDFS and Hive using Sqoop.
  • Involved in creating Hive tables, loading them with data, and writing Hive queries that run internally as MapReduce jobs.
  • Developed Pig Latin scripts for the analysis of semi-structured data (see the sketch after this list).
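
A small Pig Latin sketch of the sort of semi-structured analysis described above; the input path, delimiter, and fields are assumptions.

    #!/bin/sh
    # Hypothetical Pig job: count hits per URL in tab-delimited click data.
    cat > top_urls.pig <<'EOF'
    raw   = LOAD '/data/raw/clicks' USING PigStorage('\t')
            AS (user_id:chararray, url:chararray, ts:chararray);
    byurl = GROUP raw BY url;
    hits  = FOREACH byurl GENERATE group AS url, COUNT(raw) AS hit_count;
    top   = ORDER hits BY hit_count DESC;
    STORE top INTO '/data/out/top_urls' USING PigStorage('\t');
    EOF
    pig top_urls.pig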

Confidential

Informatica Developer

Operating Systems: Unix

Programming Languages: Teradata, Tableau (Basics)

Responsibilities:

  • Understood the requirement document.
  • Communicated with the client or business owner to ensure that requirements were clearly understood.
  • Worked on creating and managing partitions.
  • Performed data analysis and issue identification.
  • Troubleshot database issues related to performance, queries, and stored procedures.
  • Resolved production failures.
  • Extracted data from the production server to the development server, whenever required, using TPT scripts (see the sketch after this list).
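
Such production-to-development copies can be driven by a TPT job pairing the Export operator (reading production) with the Load operator (writing development); the job script name and job variables below are hypothetical.

    #!/bin/sh
    # Run a (hypothetical) TPT copy job, passing job variables inline
    # with -u instead of a variable file.
    tbuild -f prod_to_dev_copy.tpt \
           -u "SrcTdpId='tdprod', TgtTdpId='tddev', TableName='sales_db.daily_sales'" \
           -j copy_daily_sales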

Confidential

Informatica Developer

Operating Systems: Windows 7, Unix

Programming Languages: Teradata & its load utilities, Unix

Responsibilities:

  • Prepared test cases, test plans, and a strategy to test the code to be developed.
  • Worked with the source system analysts to understand the windows available for data extraction.
  • Increased performance by 35-40% in some situations: with in-depth expertise in the Teradata cost-based query optimizer, identified potential bottlenecks in queries from the aspects of query writing, skewed redistributions, join order, optimizer statistics, and physical design considerations (PI, USI, NUSI, JI, etc.).

Confidential

Informatica Developer

Operating Systems: Windows 7, Unix

Programming Languages: Teradata & its load utilities, Sybase, Unix

Responsibilities:

  • Understood the requirement document and communicated with the client or business owner to ensure that requirements were clearly understood.
  • Communicated and coordinated with other teams, such as the source data providers and the reporting team.
  • Prepared the required design documents per the defined template to ensure the requirement was clearly understood, documented, and verified by the client or business owner.
  • Prepared test cases, test plans, and a strategy to test the code to be developed.
  • Ensured that the required test environment was ready, or specified the requirements for setting one up.
  • Understood and followed the defined coding and quality standards.
  • These included the naming conventions for variables, tables, and temporary tables in stored procedures, and the naming conventions for jobs created on the client framework.
  • Standards also included the folder structure to be created in the code versioning and repository tool to maintain code versions. Followed the defined SDLC process during code development and enhancement.
  • Communicated with other teams and the client regarding the timelines for each phase of the SDLC process.
  • Performed performance tuning, including collecting statistics, analyzing EXPLAIN plans, and determining which tables needed statistics.
  • Increased performance by 35-40% in some situations: with in-depth expertise in the Teradata cost-based query optimizer, identified potential bottlenecks in queries from the aspects of query writing, skewed redistributions, join order, optimizer statistics, and physical design considerations.
  • Worked on creating and managing partitions.
  • Delivered new and complex high-quality solutions to clients in response to varying business requirements.
  • Loaded data from various data sources and legacy systems into the Teradata production and development warehouses using BTEQ, FastExport, MultiLoad, and FastLoad (see the sketch after this list).
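
A sketch of a flat-file load of the kind listed above, here using MultiLoad from the shell; the log table, logon values, layout, delimiter, and target table are placeholders.

    #!/bin/sh
    # Hypothetical MultiLoad job: load a pipe-delimited flat file into a
    # Teradata staging table.
    mload <<'EOF' > mload_customer.log 2>&1
    .LOGTABLE stg_db.customer_lt;
    .LOGON tdprod/etl_user,password;

    .BEGIN IMPORT MLOAD TABLES stg_db.customer_stg;

    .LAYOUT cust_layout;
    .FIELD cust_id   * VARCHAR(18);
    .FIELD cust_name * VARCHAR(100);

    .DML LABEL ins_cust;
    INSERT INTO stg_db.customer_stg (cust_id, cust_name)
    VALUES (:cust_id, :cust_name);

    .IMPORT INFILE customer.dat
            FORMAT VARTEXT '|'
            LAYOUT cust_layout
            APPLY ins_cust;

    .END MLOAD;
    .LOGOFF;
    EOF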

Confidential

Informatica Developer

Operating Systems: WINDOWS

Programming Languages: Teradata, Sybase, Unix.

Responsibilities:

  • Played a significant role in leading and guiding a team of 5 members, while also working on the complex issues in the project.
  • Responsibilities included discussing requirements with business partners, making the required code changes, and testing the code using the given source files.
  • Once tested, the code was moved to the repository for version maintenance and then to production through the defined code deployment process.
  • Interacted with different teams, such as the source team, the reporting team, and the business owner, to ensure a smooth process for implementing the enhancements.

Confidential

Informatica Developer

Operating Systems: WINDOWS

Programming Languages: Teradata, Unix, Sybase

Responsibilities:

  • Identified failures of jobs belonging to the portfolio.
  • Performed root cause analysis for each failure and applied fixes to avoid recurrences. Also identified processes that consumed high system resources and fine-tuned them, improving system utilization.
  • Developed jobs by understanding the existing Ab Initio logic and implementing it in the customized metadata framework.
  • Performed parallel testing between Teradata and Sybase.
  • Converted Sybase stored procedures to Teradata stored procedures (see the sketch after this list).
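
Conversions like the one above end in Teradata SPL; below is a minimal hypothetical shape such a converted procedure might take, compiled through BTEQ from the shell (the procedure, tables, and logon values are placeholders, not the project's code).

    #!/bin/sh
    # Hypothetical upsert procedure created via BTEQ's .COMPILE command:
    # update the row if the key exists, otherwise insert it.
    cat > upsert_customer.spl <<'EOF'
    REPLACE PROCEDURE edw_db.upsert_customer (IN p_cust_id   INTEGER,
                                              IN p_cust_name VARCHAR(100))
    BEGIN
      UPDATE edw_db.customer
      SET    cust_name = :p_cust_name
      WHERE  cust_id   = :p_cust_id;

      IF ACTIVITY_COUNT = 0 THEN
        INSERT INTO edw_db.customer (cust_id, cust_name)
        VALUES (:p_cust_id, :p_cust_name);
      END IF;
    END;
    EOF
    bteq <<'BTEQEOF'
    .logon tdprod/etl_user,password
    .compile file = upsert_customer.spl
    .logoff
    .quit
    BTEQEOF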

Confidential

Informatica Developer

Programming Languages: Teradata, Unix, Sybase

Responsibilities:

  • Involved in stored procedure conversion, history data migration of tables, and stored procedure testing.
  • Handled a team and effectively carried out node configuration and job scheduling for the nodes associated with the stored procedures in the Event engine.
  • Resolved complex stored procedure issues, such as spool space errors and the "target row updated by multiple source rows" error (see the sketch after this list), and guided the team in debugging errors so that the stored procedures could be deployed on time.
  • Gained thorough exposure to the MDF (file load) framework, which was suggested by the client as the Teradata replacement for all the direct file-load Sybase stored procedures.
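
The "target row updated by multiple source rows" failure (Teradata error 7547) usually means the source feeds duplicate keys into an UPDATE; one common fix is to keep a single row per key before applying the update. A hedged sketch via BTEQ, with placeholder table and column names.

    #!/bin/sh
    # Hypothetical fix for error 7547: deduplicate the source with
    # ROW_NUMBER (newest row per key wins) before the UPDATE ... FROM.
    bteq <<'EOF'
    .logon tdprod/etl_user,password

    UPDATE tgt
    FROM edw_db.customer AS tgt,
         ( SELECT cust_id, cust_name
           FROM   stg_db.customer_stg
           QUALIFY ROW_NUMBER() OVER (PARTITION BY cust_id
                                      ORDER BY load_ts DESC) = 1
         ) AS src
    SET  cust_name = src.cust_name
    WHERE tgt.cust_id = src.cust_id;

    .logoff
    .quit
    EOF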
