
Senior Teradata/Informatica Developer Resume


Atlanta, GA

SUMMARY:

  • 11 years of experience in the IT industry, with a focus on the analysis, design, development, customization, implementation, and production deployment of projects in the insurance and telecom industries.
  • 8+ years’ experience as a Teradata developer designing and implementing complex data warehousing systems.
  • 9+ years of experience in ETL as an Informatica Developer on data warehouse projects using Informatica PowerCenter 9.x/8.x, UNIX, SQL, PL/SQL, DB2, Microsoft SQL Server, and Oracle.
  • Expertise includes Data Analysis, Data Modeling, Data Cleansing, Transformation, Integration, Data Import, Data Export, and the use of ETL tools, primarily Informatica.
  • Worked extensively on Informatica PowerCenter Designer (Source Analyzer, Target Designer, Mapping Designer, Transformation Developer, and Mapplet), Workflow Manager, Task Developer, Workflow Monitor, Repository Manager and Debugger in Informatica.
  • Extensive experience in integration of various data sources like Oracle, Microsoft SQL Server, DB2, Mainframe and Flat Files.
  • Practiced in creating solution architecture documents, Technical Design Documents, Test Strategy, Test Cases, and Implementation Plans.
  • Experience in resolving on-going maintenance issues and bug fixes, monitoring Informatica sessions as well as performance tuning of mappings and sessions.
  • Excellent skills in retrieving data by writing simple and complex SQL queries.
  • Experienced in UNIX shell scripting to support and automate the ETL process, including scripts for calling the pmcmd command, file watchers, and file archiving (see the file-watcher sketch after this list).
  • Experience working with job scheduling tools.
  • Worked with File Transfer Protocol (FTP) and Secure File Transfer Protocol (SFTP) to pull or send files from one server to another.
  • Experience working in an offshore/onsite model.
  • Worked directly with business users to analyze business processes and made the changes necessary to cater to their reporting needs.
  • Experience in debugging, error handling and performance tuning of sources, targets, mappings and sessions with the help of error logs generated by Informatica server.
  • Skilled in implementing technology-based solutions for business problems and maximizing productivity through cost reduction and improved efficiency.
  • Worked in 24x7 production system support, including off-hours and weekend on-call production support responsibilities.
  • Used all the Teradata utilities (MultiLoad, FastLoad, FastExport, BTEQ, Teradata Manager, PMON, SQL Assistant, Visual Explain) extensively.
  • Extensive experience in the administration and maintenance of Dev, Stage, Prod, and standby databases for DSS and data warehousing environments.
  • Experienced in UNIX Shell scripting as part of file manipulation and text processing.
  • Good experience with Teradata query performance analysis and tuning.
  • Monitored Teradata perm space for availability and monitored Teradata CPU/load utilization.
  • Experienced in dynamic workload and resource management using Viewpoint and TDWM.
  • Programming experience in Teradata SQL.
  • Expertise in Application Development with knowledge of full software development life cycle.
  • Proficient in complex data warehousing concepts, including data marts and data mining, using dimensional modeling (Star Schema, Snowflake Schema design, and normalized (3NF) structures).
  • Proficient in using Teradata Administrator for managing databases, users, tables, indexes, statistics, and permissions (Roles & Profiles), and for addressing user issues such as resetting passwords, unlocking user IDs, etc.
  • Hands-on experience with Teradata Viewpoint as part of eyes-on-glass activity monitoring: aborting bad queries, looking for blocked sessions, and working with development teams to resolve them.
  • Good knowledge of data warehouse concepts and principles (Kimball/Inmon): Star Schema, Snowflake, SCD, Surrogate Keys, Normalization/De-normalization.
  • Analysis of change requests.
  • Coordinated with cross-functional teams in different locations to ensure quality data and analysis.
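As a minimal sketch of the file-watcher/pmcmd pattern mentioned above: poll for an inbound file, trigger the Informatica workflow, then archive the file on success. The integration service, domain, folder, workflow, and path names here are hypothetical placeholders, not taken from a real environment.

```sh
#!/bin/ksh
# File watcher: poll for the inbound file, then start the workflow via pmcmd.
# All names and paths below are illustrative.

SRC_FILE=/data/inbound/claims_daily.dat
ATTEMPTS=60            # how many times to poll
SLEEP_SECS=60          # seconds between polls

i=0
while [ ! -f "$SRC_FILE" ]; do
  i=$((i + 1))
  if [ "$i" -ge "$ATTEMPTS" ]; then
    echo "ERROR: $SRC_FILE not found after $ATTEMPTS attempts" >&2
    exit 1
  fi
  sleep "$SLEEP_SECS"
done

# Kick off the workflow and block until it completes (-wait).
pmcmd startworkflow -sv INT_SVC_DEV -d DOM_DEV -u "$INFA_USER" -p "$INFA_PWD" \
  -f FLD_CLAIMS -wait wf_load_claims_daily
rc=$?

# Archive the processed file with a timestamp suffix on success.
[ "$rc" -eq 0 ] && mv "$SRC_FILE" /data/archive/claims_daily.dat.$(date +%Y%m%d%H%M%S)
exit "$rc"
```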

TECHNICAL SKILLS:

Tools & Utilities: Teradata 15, Viewpoint, Teradata Administrator, Teradata SQL Assistant, BTEQ, MultiLoad, TPT, PMON, Teradata Manager, Visual Explain, Teradata Statistics Wizard, Informatica 9, Erwin

Databases: Teradata, MS SQL Server 2005, Oracle 9

Languages: C, C++, SQL, UNIX shell scripting, BTEQ

Technologies: Data Warehousing

Operating Systems: Windows 95/98/2000/2003/XP/Vista/7, Sun Solaris 5.6, UNIX

PROFESSIONAL EXPERIENCE:

Confidential, ATLANTA, GA

Senior Teradata/Informatica Developer

Responsibilities:

  • Involved in requirements gathering sessions for ETL requirements across releases and projects.
  • Produced high-level and low-level design documents for various change requests and service requests by translating the Functional Solution Requirement documents, and prepared estimates for the requirements.
  • Interacted actively with Business Analysts and Data Modelers on mapping documents and the design process for various sources and targets as part of project coordination.
  • Developed several complex mappings in Informatica with a variety of PowerCenter transformations, Mapping Parameters, Mapping Variables, Mapplets, and Parameter files in Mapping Designer using Informatica PowerCenter.
  • Worked extensively with most of the transformations, such as Source Qualifier, Expression, Router, Joiner, Sorter, Filter, Rank, Sequence Generator, Aggregator, and Lookup, as per the business requirements to populate target files efficiently.
  • Wrote complex SQL queries to develop interfaces that extract data at regular intervals to meet business requirements.
  • Extensively involved in performance tuning of the Informatica ETL mappings by using caches, overriding SQL queries, and using components such as Variables and Parameter files.
  • Analyzed session log files on session failures to resolve errors in mapping or session configuration, and debugged mappings to fix issues.
  • Wrote various UNIX shell scripts for scheduling data cleansing scripts and loading processes and for automating the execution of maps.
  • Scheduled tasks using the job scheduler.
  • Involved in business document walkthroughs with functional teams to design application documents, mapping documents, and data flow diagrams.
  • Obtained end-user and UAT sign-off from the business before implementing in production.
  • Coordinating with the Offshore Team and Client on Releases/Projects/Fast Tracks.
  • Managing the daily execution of workflows and sessions in the production environment.
  • Involved in monitoring jobs, analyzing the reasons for failures with the client, and designing strategies to resolve issues.
  • Enhanced and modified existing ETL processes and scripts to fix defects at the root cause as part of project enhancements.
  • Worked on complex business activities requiring fast and accurate delivery, including analysis, design, coding, unit testing, and implementation support.
  • Generating weekly/monthly/yearly statistics reports on production processes.
  • Overseeing the Quality procedures related to the project.
  • Created deployment plan for all the ETL processes.
  • Involved in unit testing, functional testing, and performance tuning of queries.
  • Created staging tables, indexes, sequences, and views, and performed tuning steps such as analyzing tables and proper indexing.
  • Identified and resolved bottlenecks in sources, targets, transformations, mappings, and sessions to improve performance.
  • Created macros and stored procedures.
  • Involved in analysis of end user requirements and business rules based on given documentation and working closely with tech leads and analysts in understanding the current system.
  • Designing flow of data from daily tables to weekly tables.
  • Prepared unit and system integration test cases.
  • Involved in setting up data for the UAT environment and providing support.
  • Conducted several sessions with key business analysts to gather requirements related to reports and KPIs.
  • Performed extensive data profiling on the source data by loading data samples into the database using database load utilities.
  • Worked with Data Modelers to create a star schema model for the above subject areas and made sure all the requirements could be generated from the models created.
  • Created high-level and detail design documents for the above data mart projects covering process flows, mapping documents, initial and delta load strategies, surrogate key generation, Type I and Type II dimension loading, balance and control, exception processing, process dependencies, and scheduling.
  • Identified the required dependencies between ETL processes and triggers to schedule the jobs that populate the data marts on a scheduled basis.
  • Worked with the DBA team to ensure implementation of the databases for the physical data models intended for the above data marts. Created proper Teradata Primary Indexes (PIs), taking into consideration both the planned access paths and even distribution of data across all available AMPs; considering the business requirements alongside those factors, created appropriate Teradata NUSIs for fast, easy access to the data (see the PI/NUSI sketch after this list).
  • Extensively used Teradata utilities such as BTEQ, FastLoad, and MultiLoad, together with DDL and DML commands (SQL).
  • Created several BTEQ scripts involving derived tables and volatile/global temporary tables for ad hoc extracts delivered to several business users on a scheduled basis (a BTEQ sketch follows this list).
  • Involved in collecting statistics on important tables so the Teradata Optimizer produces better plans.
  • Tuned user queries by analyzing EXPLAIN plans, recreating user driver tables with the right Primary Index, scheduling statistics collection, and adding secondary or join indexes.
  • Implemented data-level, object-level, and package-level security in Framework Manager to achieve highly complex security standards.
  • Created master-detail reports, drill-through and custom prompting reports, and scheduled reports for efficient resource utilization.
  • Performed HLD and LLD design of the system, along with development and debugging of the Informatica mappings.
  • Involved in functional requirements gathering.
  • Defined the schema, staging tables, and landing tables; configured base objects (Customer, Address, Organization), foreign-key relationships, query groups, and queries.
  • Created mappings using transformations such as Source Qualifier, Aggregator, Expression, Lookup, Router, Filter, and Update Strategy.
  • Conducted sessions with business and development stakeholders on data models and data strategies.
  • Actively involved in post-production implementation support.
  • Prepared customized database objects such as PL/SQL Procedures, Functions, and Packages.
  • Created UNIX scripts to manipulate and load the data.
  • Designed and implemented stored procedures and triggers for automating tasks in SQL.
  • Created complex mappings in PowerCenter Designer using Aggregator, Expression, Filter, Sequence Generator, Update Strategy, Union, Lookup, Joiner, XML Source Qualifier, and Unconnected Lookup transformations.
  • Responsible for the technical implementation of change request solutions.
  • Supported enterprise data warehouse and ETL development activities.
  • Analyzed the possibilities for solution enhancement.
  • Implemented Slowly Changing Dimensions (SCD) Type 1 and Type 2 to maintain current and historical information in the dimension tables.
  • Extracted data from flat files and an Oracle database, and applied business logic to load the data into the central Oracle database.
  • Customized and developed database objects such as PL/SQL Procedures, Functions, Packages, and Views; involved in database design and preparing SQL scripts to support the larger databases.
  • Reviewed and analyzed functional requirements and mapping documents.
  • Analyzed requests for changes and new business requirements.
  • Coordinated with the Configuration Management team on code deployments.
  • Analyzed business requirements and designed and wrote technical specifications to design and redesign solutions; involved in analyzing and building the Teradata EDW using Teradata ETL utilities and Informatica.
  • Involved in requirements gathering, business analysis, design, development, testing, and implementation of business rules.
  • Created tables, views, and macros in Teradata according to the requirements.
  • Created, optimized, reviewed, and executed Teradata SQL test queries to validate transformation rules used in source-to-target mappings/source views and to verify data in target tables.
  • Performed tuning and optimization of complex SQL queries using Teradata EXPLAIN.
  • Responsible for COLLECT STATISTICS on FACT tables.
  • Defined, designed, integrated, and re-engineered the Enterprise Data Warehouse and Data Marts in different environments such as Teradata and Oracle, with multiple terabytes of data and various levels of complexity.
  • Involved in the design, enhancement, and development of applications for OLTP, OLAP, and DSS using dimensional modeling with Teradata and Oracle.
  • Advanced SQL skills, including the use of derived tables, unions, and multi-table inner/outer joins.
  • Working experience implementing large Teradata DBMSs.
  • Experienced in building data marts, data structures, data storage, data warehouses, data archives, and data analyses.
  • Analyzed and recommended solutions for data issues.
  • Interacted with the business to collect critical business metrics and provided solutions to certify data for business use.
  • Created, validated, and updated the data dictionary and analyzed documentation to make sure the information captured is correct.
  • Provided architecture and design support for business-initiated requests and projects.
  • Wrote Teradata SQL queries for joins and any modifications to tables.
  • Proposed architectural design changes to improve data warehouse performance.
  • Visualized the data architecture design from high level to low level and designed performance objects for each level.
  • Troubleshot database issues related to performance, queries, and stored procedures.
  • Fine-tuned existing scripts and processes to achieve increased performance and reduced load times for faster user query performance.
  • Provided architecture and design support for quality controls on critical business data sets.
  • Performed analysis of business requirements and KPIs and provided solutions to certify data for business use.
  • Created BTEQ, FastExport, MultiLoad, TPump, and FastLoad scripts for extracting data from various production systems.
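As a concrete illustration of the PI/NUSI choices referenced in this list, here is a hedged sketch of the DDL involved, run through BTEQ from a shell script. The TDPID, credentials, and table/column names are invented for the example.

```sh
#!/bin/ksh
# Illustrative DDL through BTEQ: a PI on a high-cardinality column for even
# AMP distribution, plus a NUSI on the planned access path. Placeholders only.
bteq <<'EOF'
.LOGON tdprod/etl_user,password;

CREATE MULTISET TABLE edw.claim_fact
( claim_id    BIGINT NOT NULL,       -- high cardinality: spreads rows evenly
  member_id   INTEGER NOT NULL,
  claim_dt    DATE FORMAT 'YYYY-MM-DD',
  paid_amt    DECIMAL(12,2)
)
PRIMARY INDEX ( claim_id );

-- NUSI supporting frequent member-level lookups
CREATE INDEX idx_claim_fact_member ( member_id ) ON edw.claim_fact;

.LOGOFF;
.QUIT;
EOF
```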
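And a sketch of the scheduled BTEQ extract pattern described above, using a volatile table and a statistics refresh; again, every object and path name is a hypothetical stand-in.

```sh
#!/bin/ksh
# Scheduled BTEQ extract: stage rows in a volatile table, refresh stats on the
# source table, and export a report file. Names and paths are examples only.
bteq <<'EOF' > /logs/active_members_extract.log 2>&1
.LOGON tdprod/etl_user,password;

CREATE VOLATILE TABLE vt_active_members AS
( SELECT member_id, plan_cd
  FROM   edw.member_dim
  WHERE  end_dt IS NULL
) WITH DATA
PRIMARY INDEX ( member_id )
ON COMMIT PRESERVE ROWS;

-- Keep optimizer plans healthy on the frequently joined source table
COLLECT STATISTICS COLUMN ( member_id ) ON edw.member_dim;

.EXPORT REPORT FILE = /data/outbound/active_members.txt;
SELECT member_id, plan_cd
FROM   vt_active_members
ORDER  BY member_id;
.EXPORT RESET;

.LOGOFF;
.QUIT;
EOF
```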

Confidential, DALLAS, TX

Teradata/Informatica Developer

Responsibilities:

  • Worked closely with the Business Analysts and Data Analysts to understand business requirements and created Functional and ETL specification documents.
  • Analyzed the Business Requirement Documents (BRD) and laid out the steps for the data extraction, business logic implementation & loading into targets.
  • Actively participated in building collaboration with the client and proactively supported the team in their tasks.
  • Managed daily calls covering work status and weekly team status.
  • Developed source-target mappings and documented the same.
  • Involved in all phases of ETL, from source to target.
  • Extensively worked with Informatica - Source Analyzer, Warehouse Designer, Transformation developer, Mapplet Designer, Mapping Designer, Workflow Manager, and Workflow Monitor to develop various complex mappings, mapplets, reusable transformations, session tasks and Workflows.
  • Extensively worked on data Extraction, Transformation and Loading from source to target system using Informatica.
  • Involved in gathering information for various Change Data Capture scenarios.
  • Actively involved in designing different types of mappings for effective data processing and high performance.
  • Involved in system testing, performance and System Integration testing stages of the project.
  • Took care of quality procedures for analysis, program specifications, exhaustive test plans, defect tracking, change procedures, etc.
  • Involved in design discussions for the next phases.
  • Managed and actively worked on QA and defect fixing for the entire set of ETL jobs.
  • Created high-level and detail design documents for the above data mart projects covering process flows, mapping documents, initial and delta load strategies, surrogate key generation, Type I and Type II dimension loading, balance and control, exception processing, process dependencies, and scheduling.
  • Identified the required dependencies between ETL processes and triggers to schedule the jobs that populate the data marts on a scheduled basis.
  • Worked with the DBA team to ensure implementation of the databases for the physical data models intended for the above data marts. Created proper Teradata Primary Indexes (PIs), taking into consideration both the planned access paths and even distribution of data across all available AMPs; considering the business requirements alongside those factors, created appropriate Teradata NUSIs for fast, easy access to the data.
  • Extensively used Teradata utilities such as BTEQ, FastLoad, and MultiLoad, together with DDL and DML commands (SQL); a FastLoad sketch follows this list.
  • Created several BTEQ scripts involving derived tables and volatile/global temporary tables for ad hoc extracts delivered to several business users on a scheduled basis.
  • Involved in collecting statistics on important tables so the Teradata Optimizer produces better plans.
  • Tuned user queries by analyzing EXPLAIN plans, recreating user driver tables with the right Primary Index, scheduling statistics collection, and adding secondary or join indexes.
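A hedged sketch of the FastLoad usage referenced above: bulk-loading a pipe-delimited extract into an empty staging table (FastLoad requires the target to exist and be empty). Credentials, file, and table names are invented for the example.

```sh
#!/bin/ksh
# Minimal FastLoad script: load a pipe-delimited file into an empty staging
# table with the two standard error tables. Placeholder names throughout.
fastload <<'EOF'
LOGON tdprod/etl_user,password;

DROP TABLE stg.claims_err1;
DROP TABLE stg.claims_err2;

SET RECORD VARTEXT "|";

DEFINE claim_id  (VARCHAR(18)),
       member_id (VARCHAR(11)),
       paid_amt  (VARCHAR(14))
FILE = /data/inbound/claims_daily.dat;

BEGIN LOADING stg.claims_stg
  ERRORFILES stg.claims_err1, stg.claims_err2;

INSERT INTO stg.claims_stg
( claim_id, member_id, paid_amt )
VALUES ( :claim_id, :member_id, :paid_amt );

END LOADING;
LOGOFF;
EOF
```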

Confidential

Teradata / Informatica Developer

Responsibilities:

  • Interacted with business analysts and gathered requirements.
  • Produced high-level and low-level design documents for various change requests and service requests by translating the Functional Solution Requirement documents.
  • Designed and developed ETL Mappings using Informatica to extract data from flat files and Oracle, and to load the data into the target database.
  • Successfully created complex Informatica mappings to filter and load the data, including extensive use of transformations such as Aggregator, Filter, Router, Expression, Joiner, Union, Normalizer, and Sequence Generator.
  • Extensively used various transformations such as Source Qualifier, Joiner, Aggregator, Connected and Unconnected Lookup, Rank, and Filter.
  • Extracted data from various sources, including SQL Server, Oracle, and other relational databases.
  • Integrated workflow and created Autosys jobs for scheduling.
  • Administered user privileges, groups, and folders, including their creation, update, and deletion.
  • Performed unit testing on the assigned tasks after completing the appropriate mappings.
  • Used pmcmd/pmrep commands to run Informatica from the back end.
  • Developed Batch Jobs using UNIX Shell scripts to automate the process of loading, pushing and pulling data from and to different servers.
  • Performed unit testing for developed objects and was involved in resolving defects raised during Integration Testing, System Testing, UAT, and post-production.
  • Involved in code migration activities from one environment to another.
  • Took care of quality procedures for analysis, program specifications, exhaustive test plans, defect tracking, change procedures, etc.
  • Evaluated and implemented, as appropriate, Teradata features to improve performance and reduce Teradata resource utilization.
  • Performed query tuning to reduce Teradata resource consumption and shorten response times, including partitioning and index creation and alteration, along with application-specific performance tuning to solve spool-out and long-running-query issues.
  • Performed data backups of tables during truncation and reloaded the data into the base tables after obtaining business approval; handled user access requests to databases for duplication and relevant environments, and reported findings to the business and IBM Teradata system DBAs.
  • Performed migration activities.
  • Managed database space, allocating new space to databases and moving space between databases on an as-needed basis.
  • Created roles and profiles on an as-needed basis; granted privileges to roles and added users to roles based on requirements (a space-and-role sketch follows this list).
  • Extensively worked with DBQL data to identify high-usage tables and columns (a sample DBQL query follows this list).
  • Used Teradata Manager, BTEQ, FastLoad, MultiLoad, TPump, SQL, and TASM for workload management.
  • Monitored and aborted bad queries using PMON, looked for blocked sessions, and worked with development teams to resolve them.
  • Performed workload management using various tools such as Teradata Manager, FastLoad, MultiLoad, TPump, TPT, and SQL Assistant.
  • Used external loaders (MultiLoad, TPump, and FastLoad) to load data into the Teradata database; involved in analysis, development, testing, implementation, and deployment.
  • Monitored the load on different database servers.
  • Investigated and resolved requests and issues raised by end users through the problem management tool (Remedy).
  • Provided support for daily, weekly, and monthly ETL activities during the ETL cycle.
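As an illustration of the space and role administration above, a sketch of the typical statements run through BTEQ; the database, role, profile, and user names are hypothetical.

```sh
#!/bin/ksh
# Routine space/role administration via BTEQ. All object names are examples.
bteq <<'EOF'
.LOGON tdprod/dba_user,password;

-- Allocate more perm space to a growing database (bytes)
MODIFY DATABASE edw_stage AS PERM = 500e9;

-- Role-based access: define once, grant rights, then add users
CREATE ROLE etl_read_role;
GRANT SELECT ON edw TO etl_read_role;
GRANT etl_read_role TO etl_user;

-- Profile for session defaults such as spool and default database
CREATE PROFILE etl_profile AS SPOOL = 50e9, DEFAULT DATABASE = edw_stage;
MODIFY USER etl_user AS PROFILE = etl_profile;

.LOGOFF;
.QUIT;
EOF
```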
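And a sample of the kind of DBQL analysis mentioned above: ranking tables by the AMP CPU consumed by queries that touched them over the past week. Column availability depends on the DBQL logging options enabled, so treat this as a sketch rather than a fixed report.

```sh
#!/bin/ksh
# DBQL usage analysis: most CPU-expensive tables over the last 7 days.
bteq <<'EOF'
.LOGON tdprod/dba_user,password;

SELECT o.ObjectDatabaseName,
       o.ObjectTableName,
       COUNT(*)          AS query_cnt,
       SUM(q.AMPCPUTime) AS total_amp_cpu
FROM   DBC.DBQLObjTbl o
JOIN   DBC.DBQLogTbl  q
  ON   o.QueryID = q.QueryID
WHERE  CAST(q.StartTime AS DATE) >= CURRENT_DATE - 7
  AND  o.ObjectType = 'Tab'
GROUP  BY 1, 2
ORDER  BY total_amp_cpu DESC;

.LOGOFF;
.QUIT;
EOF
```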

Confidential

Informatica/ETL Developer

Responsibilities:

  • Responsible for gathering and analyzing requirements for ETL logic using ETL specifications.
  • Interacted with business users on the design of technical specification documents.
  • Designed Data Flow Diagrams (DFDs) and ETL technical specs or lower-level design documents for all the source applications.
  • Worked with sources such as Oracle, VSAM, and flat files.
  • Extensively worked with various Active transformations like Filter, Sorter, Aggregator, Router, Lookup and Joiner transformations.
  • Extensively worked with various Passive transformations like Expression, Sequence Generator, Mapplet Input and Mapplet Output transformations.
  • Created complex mappings using Unconnected and Connected Lookup transformations.
  • Worked extensively with the Update Strategy transformation to implement inserts and updates.
  • Implemented Slowly Changing Dimension Type 1 and Type 2 for change data capture (a Type 2 sketch follows this list).
  • Worked with various lookup caches, including Dynamic Cache, Static Cache, Persistent Cache, Recache from Database, and Shared Cache.
  • Worked with various Informatica PowerCenter objects such as mappings, transformations, mapplets, workflows, and session tasks.
  • Extensively used debugger to test the logic implemented in the mappings.
  • Performed error handling using session logs.
  • Wrote various UNIX shell scripts for scheduling data cleansing scripts and loading processes and for automating the execution of maps.
  • Performed unit testing for developed objects and was involved in resolving defects raised during Integration Testing, System Testing, UAT, and post-production.
  • Involved in code migration activities from one environment to another.
  • Modified existing ETL processes and scripts to fix defects at the root cause as part of project enhancements.
  • Worked on SQL queries and tuned them to improve performance.
  • Responsible for the performance tuning of the ETL process at source level, target level, mapping level and session level.
  • Monitored workflows and sessions using the PowerCenter Workflow Monitor.
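To make the Type 2 pattern above concrete, here is a hedged two-step SQL sketch (expire changed rows, then insert fresh current versions) run through BTEQ. The dimension and staging table names, the tracked columns, and the assumption that the surrogate key is an identity column are all illustrative.

```sh
#!/bin/ksh
# SCD Type 2 sketch: step 1 closes out changed current rows, step 2 inserts
# new current versions. Assumes customer_sk is an IDENTITY column.
bteq <<'EOF'
.LOGON tdprod/etl_user,password;

-- Step 1: expire current rows whose tracked attributes changed
UPDATE dim
FROM edw.customer_dim AS dim, stg.customer_stg AS stg
SET eff_end_dt   = CURRENT_DATE - 1,
    current_flag = 'N'
WHERE dim.customer_nk = stg.customer_nk
  AND dim.current_flag = 'Y'
  AND (dim.addr <> stg.addr OR dim.plan_cd <> stg.plan_cd);

-- Step 2: insert a new current row for new keys and just-expired keys
INSERT INTO edw.customer_dim
( customer_nk, addr, plan_cd, eff_start_dt, eff_end_dt, current_flag )
SELECT stg.customer_nk, stg.addr, stg.plan_cd,
       CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM stg.customer_stg stg
LEFT JOIN edw.customer_dim dim
  ON  dim.customer_nk = stg.customer_nk
  AND dim.current_flag = 'Y'
WHERE dim.customer_nk IS NULL;

.LOGOFF;
.QUIT;
EOF
```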

Confidential

Informatica Developer

Responsibilities:

  • Utilized in-depth functional and technical experience and business skills to deliver solutions to customers.
  • Delivered new and complex high-quality solutions to clients in response to varying business requirements.
  • Responsible for effective communication between the project team and the customer; provided day-to-day direction to the project team and regular project status updates to the customer.
  • Responsible for planning the project execution from offshore and coordinating the design, development, testing, and implementation activities from offshore.
  • Translated business requirements into Functional Requirements Documents and Detailed Design Documents.
  • Led efforts, including programming and testing, that concluded in client acceptance of the results.
  • Managed ETL development using the Informatica PowerCenter module.
  • Worked with Source Analyzer, data mappings, transformations, Informatica Repository Manager, Workflow Manager, and Workflow Monitor.
  • Used most of the transformations, such as Source Qualifier, Router, Filter, Sequence Generator, Stored Procedure, and Expression, as per the business requirements.

Confidential

Informatica Developer

Responsibilities:

  • Interacted with business analysts and gathered requirements.
  • Produced high-level and low-level design documents for various change requests and service requests by translating the Functional Solution Requirement documents.
  • Designed and developed ETL Mappings using Informatica to extract data from flat files and Oracle, and to load the data into the target database.
  • Successfully created complex Informatica mappings to filter and load the data, including extensive use of transformations such as Aggregator, Filter, Router, Expression, Joiner, Union, Normalizer, and Sequence Generator.
  • Extensively used various transformations such as Source Qualifier, Joiner, Aggregator, Connected and Unconnected Lookup, Rank, and Filter.
  • Extracted data from various sources, including SQL Server, Oracle, and other relational databases.
  • Integrated workflow and created Autosys jobs for scheduling.
  • Administered user privileges, groups, and folders, including their creation, update, and deletion.
  • Performed unit testing on the assigned tasks after completing the appropriate mappings.
  • Used pmcmd/pmrep commands to run Informatica from the back end.
  • Developed Batch Jobs using UNIX Shell scripts to automate the process of loading, pushing and pulling data from and to different servers.
  • Performed unit testing for developed objects and was involved in resolving defects raised during Integration Testing, System Testing, UAT, and post-production.
  • Involved in code migration activities from one environment to another.
  • Took care of quality procedures for analysis, program specifications, exhaustive test plans, defect tracking, change procedures, etc.
