Sr. ETL Developer Resume
Sunnyvale, CA
SUMMARY:
- 8+ years of IT experience in analysis, design, development, and implementation of data warehousing using ETL, BI, and Tidal tools with Oracle, DB2, MS SQL Server, and Teradata databases on Windows/UNIX platforms.
- Expert-level experience in Data Integration and Data Warehousing using the ETL tool Informatica PowerCenter 7.x/8.x/9.x. Knowledge of the Informatica tools PowerExchange, PowerCenter, Data Analyzer, and Metadata Manager.
- Expert-level experience in designing and developing complex mappings to extract data from diverse sources including flat files, RDBMS tables, legacy system files, XML files, applications, COBOL sources, and Teradata.
- Proficient with the concepts of Data Warehousing, Data Marts, ER Modeling, Dimensional Modeling, and Fact and Dimension tables, with the data modeling tools Erwin and ER Studio.
- Technical and functional experience in data warehouse implementations and ETL methodology using Informatica PowerCenter 9.0.1/8.6/8.1/7.1.
- Strong experience in Data Cleansing, Data Profiling, and Data Analysis. Involved in building data warehouses using Oracle 10g/9i/8i, Teradata, and Netezza in the Finance and Retail domains, among others.
- Proficient in implementing Complex business rules by creating re-usable transformations, workflows/worklets and Mappings/Mapplets.
- Experience in Performance Tuning of mappings and sessions.
- Thorough knowledge of database management concepts such as conceptual, logical, and physical data modeling, and data definition, population, and manipulation. Expertise in logical and physical database design and data modeling using tools like Erwin.
- Involved in data migration projects from DB2 and Oracle to Teradata. Created automated migration scripts using UNIX shell scripting, Oracle/Teradata SQL, and Teradata macros and procedures.
- Strong expertise in relational database systems such as Oracle, SQL Server, Teradata, MS Access, and DB2, including design and database development using SQL, PL/SQL, SQL*Plus, TOAD, and SQL*Loader. Highly proficient in writing, testing, and implementing triggers, stored procedures, functions, packages, and cursors using PL/SQL.
- Experience in task automation using PowerShell scripts, job scheduling (Tidal), and communicating with the server using pmcmd.
- Strong knowledge in Database administration and Data Warehousing concepts.
- Experience in Informatica Power Center Client tools and Server Administration (Designer, Workflow Manager, Workflow Monitor, Repository Manager)
- Worked extensively on stages such as Transformer, Aggregator, Merge, Join, Copy, Lookup, Lookup File Set, Filter, Change Capture, Oracle Enterprise, Column Generator, and Dataset for data cleansing and data transformation.
- Implemented Slowly Changing Dimension (SCD) Type 1 and Type 2 methodologies.
- Extensive experience in Teradata MPP platform.
- Proven record in both technical and functional applications of RDBMS, Data Mapping, Data management and Data transportation.
- Performed performance tuning of user queries by analyzing explain plans, recreating user driver tables with the right primary index, scheduling collection of statistics, and defining secondary or various join indexes.
- Hands on experience on Tableau Desktop versions (8.2/ 9.2.7), Tableau Reader and Tableau Server.
- Developed complex calculated fields for business logic, sets, groups, parameters, and field actions to provide various filtering capabilities for the dashboards, along with drill-down and drop-down features for the detailed reports.
- Experience in configuring reports from different data sources using data blending.
- Built and published customized interactive reports and dashboards with report scheduling using Tableau Server, including creating new schedules and monitoring the tasks daily.
- Expertise in ETL using Talend Studio and SQL*Loader for data migration to the Enterprise Data Warehouse.
- Used Talend Administrator Center (TAC) and UC4 (Automic) for automating daily/weekly Talend ETL refresh jobs in the production environment.
- Gathered user requirements, analyzed and designed solutions based on the requirements.
- Strong problem-solving, analytical, interpersonal, and communication skills; self-motivated, with the ability to work both independently and as part of a team and to interact with individuals at all levels.
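As a quick illustration of the SCD Type 2 methodology referenced above, the row-versioning logic can be sketched in Python; the record layout, column names, and dates here are hypothetical, not from any specific project:

```python
from datetime import date

# Minimal SCD Type 2 sketch: when a tracked attribute changes, expire the
# current version of the record and append a new current version.
# "nk" (natural key), "attrs", and the date columns are illustrative names.
def scd2_apply(dim_rows, change, today):
    out = []
    for row in dim_rows:
        if row["nk"] == change["nk"] and row["current"]:
            if row["attrs"] == change["attrs"]:
                out.append(row)                      # no change: keep as-is
                continue
            out.append(dict(row, current=False, end_date=today))  # close old version
        else:
            out.append(row)
    if not any(r["nk"] == change["nk"] and r["current"] for r in out):
        out.append({"nk": change["nk"], "attrs": change["attrs"],
                    "start_date": today, "end_date": None, "current": True})
    return out

rows = [{"nk": "C1", "attrs": {"city": "Sunnyvale"},
         "start_date": date(2015, 1, 1), "end_date": None, "current": True}]
rows = scd2_apply(rows, {"nk": "C1", "attrs": {"city": "San Jose"}}, date(2016, 6, 1))
```

The history row keeps its original start date and gains an end date, while the new row becomes the single current version, which is what lets a dimension answer "as of" queries.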
TECHNICAL SKILLS
Data Warehousing/ETL: Informatica PowerCenter 9.6/9.1/8.6/8.1, Informatica PowerMart 9/8.x (Workflow Manager, Workflow Monitor, Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer, Mapping Designer, Repository Manager), Metadata, Data Mart, OLAP, OLTP, Erwin, Talend 5.6, Tableau 8.2/9.2.7
Dimensional Data Modeling: Dimensional Data Modeling, 3NF Data Modeling, Star Schema Modeling, Snow-Flake Modeling, Physical and Logical Data Modeling, Erwin and Sybase Power Designer.
Databases & Tools: Teradata 12/13/13.10/14, Oracle 11g/10g/9i/8i/8.x, DB2 UDB 8.5, MS SQL Server 2005/2000/7.0/6.5, SQL*Plus, SQL*Loader, TOAD, SQL Assistant, Erwin 7.3/8/9, ER Studio, Greenplum, Cassandra
Scheduling Tools: Informatica Workflow Manager, Autosys, Control-M, TAC, UC4/Automic
Programming Languages: Unix Shell Scripting, SQL, PL/SQL, Perl, Teradata Procedure
Environment: UNIX, Win XP/NT 4.0, Sun Solaris 2.6/2.7, HP-UX 10.20/9.0, IBM AIX 4.2/4.3
PROFESSIONAL EXPERIENCE:
Confidential, Sunnyvale CA
Sr. ETL Developer
Responsibilities:
- Performed query optimization with the help of explain plans, collect statistics, and primary and secondary indexes. Used volatile tables and derived queries for breaking up complex queries into simpler ones.
- Used SQL features such as GROUP BY, ROLLUP, RANK, QUALIFY, CASE, regular expressions, SUBSTR, TO_NUMBER, UNION, subqueries, EXISTS, COALESCE, and NULL handling.
- Design, Development and Documentation of the ETL (Extract, Transformation & Load) strategy to populate the Data Warehouse from the various source systems.
- Identified performance issues in existing sources, targets by analyzing the data flow, evaluating transformations and tuned accordingly for better performance.
- Developed scripts to load data into the base tables in the EDW, from source to staging, and from the staging area to target tables using the FastLoad, MultiLoad, and BTEQ utilities of Teradata. Wrote scripts for data cleansing, validation, and transformation of data coming from different source systems.
- Wrote complex SQL using joins, subqueries, and correlated subqueries. Expertise in SQL queries for cross-verification of data.
- Extensively used UNIX Shell Scripting for writing SQL execution scripts in Data Loading Process.
- Tuned Teradata SQL statements using EXPLAIN, analyzing the data distribution among AMPs and index usage, collecting statistics, defining indexes, revising correlated subqueries, using hash functions, etc.
- Very good understanding of Database Skew, PPI, Join Methods and Join Strategies, Join Indexes including sparse, aggregate and hash.
- Performing data quality & integrity, end to end testing, reverse engineering & document existing ETL program codes.
- Used SQL to query the databases and to do as much crunching as possible in Teradata, using complex SQL query optimization (explain plans, collect statistics, data distribution across AMPs, primary and secondary indexes, locking, etc.) to achieve better performance.
- Excellent experience in performance tuning and query optimization of the Teradata SQLs.
- Worked on Teradata Temporal features such as period data type, CURRENT, SEQUENCED and NONSEQUENCED to maintain History in DW tables.
- Extracted data from GreenPlum into Teradata and created dashboards for Business using Tableau.
- Used Teracopy to copy the data from different environments and test the data.
- Extracted JSON data from Cassandra to HDFS, converted the JSON documents, and loaded them into files and then into tables, mapping the needed fields from the JSON.
- Created Jobs, Configuring Workflows in UC4 (Automic) to extract the data from the various sources, transforming data and load into staging and Target tables in EDW.
- Ran and monitored daily scheduled jobs using UC4 (Automic) to support EDW (Enterprise Data Warehouse) loads for history as well as incremental data.
- Developing as well as modifying existing jobs for enhancements of new business requirements to load into staging tables and then to target tables in EDW.
- Used GitHub to move the files/scripts from DEV/QA to Production Environment.
- Wrote YAML scripts to extract data from various ETL sources and to automate and monitor daily/weekly/monthly scheduled jobs and workflows.
- Implemented Teradata MERGE statements to update huge tables, improving the performance of the application.
- Wrote shell scripts to send data file feeds to the Business daily, based on the requirements.
- Building, publishing customized interactive reports and dashboards, report scheduling using Tableau server.
- Designed, developed, tested, and maintained Tableau functional reports based on user requirements.
- Created quick/context/global/action filters, parameters, and calculated fields for preparing dashboards and worksheets in Tableau to handle views more efficiently, and scheduled extract refreshes for weekly and monthly reports.
- Involved in Tableau administration tasks such as publishing workbooks, setting permissions, providing access to users and adding them to specific groups and projects, and scheduling refreshes on Tableau Server.
- Developed Tableau visualizations and dashboards using Tableau Desktop and developed Tableau workbooks from multiple data sources using Data Blending.
- Designed Fact tables and Dimension tables for star schemas and snowflake schemas using ERwin tool.
- Developed Tableau workbooks to perform year over year, quarter over quarter, YTD, QTD and MTD type of analysis.
- Built dashboards for measures with forecast, trend line and reference lines.
- Published the developed dashboards and reports on the Tableau Server so that end users with access to the server can view the data.
- Consolidated multiple Tableau workbooks into single workbook.
- Created and enhanced ETL Talend jobs for Data validation, Extracts and Full Refreshes.
- Understood the business requirements and developed design specifications for applications using Talend Enterprise Data Integration.
- Used Talend Administrator Center (TAC) for automating daily/weekly Talend ETL refresh jobs in the production environment.
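The Teradata MERGE-style upsert mentioned above can be sketched as follows, using SQLite's `ON CONFLICT` clause (SQLite 3.24+) as a stand-in for Teradata's `MERGE INTO ... WHEN MATCHED / WHEN NOT MATCHED`; the table and column names are illustrative:

```python
import sqlite3

# Upsert a batch of staged rows into a target table in one statement:
# matching keys are updated in place, new keys are inserted, which avoids
# the separate UPDATE-then-INSERT passes that slow down loads on huge tables.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE tgt (id INTEGER PRIMARY KEY, amt REAL)")
con.execute("INSERT INTO tgt VALUES (1, 10.0), (2, 20.0)")

staged = [(2, 25.0), (3, 30.0)]   # one changed row, one brand-new row
con.executemany(
    """INSERT INTO tgt (id, amt) VALUES (?, ?)
       ON CONFLICT(id) DO UPDATE SET amt = excluded.amt""",
    staged,
)
rows = con.execute("SELECT id, amt FROM tgt ORDER BY id").fetchall()
```

Teradata's actual MERGE adds source-table joins and conditional WHEN clauses, but the matched-update / unmatched-insert semantics are the same.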
Environment: Teradata 15/14, Oracle 11g/10g, Tableau 8.2/9.2.7, Talend 5.6, Talend Administrator Center (TAC), UC4 (Automic) 11.2, Cassandra, Greenplum, GitHub, DataStax DevCenter
Confidential, Los Angeles, CA
Sr. Teradata/Sr. Informatica Developer
Responsibilities:
- Involved in gathering business requirements, logical modeling, physical database design, data sourcing and data transformation, data loading, SQL and performance tuning.
- Extracted data from a DB2 database on mainframes and loaded it into SET and MULTISET tables in the Teradata database using various Teradata load utilities. Transferred large volumes of data using Teradata FastLoad, MultiLoad, and T-Pump.
- Performed query optimization with the help of explain plans, collect statistics, and primary and secondary indexes. Used volatile tables and derived queries for breaking up complex queries into simpler ones. Streamlined the Teradata scripts and shell scripts migration process on the UNIX box.
- Designed and developed a number of complex mappings using various transformations like Source Qualifier, Aggregator, Router, Joiner, Union, Expression, Lookup, Filter, Update Strategy, Stored Procedure, Sequence Generator, etc.
- Created reusable sessions, Workflow, Worklet, Assignment, Decision, Event Wait and Raise and Email Task and Scheduled Task based on Client requirement.
- Used Informatica debugger to test the data flow and fix the mappings.
- Involved in creating unit test plans and testing the data for various applications.
- Developed new and modified existing mappings to meet new business requirements, loading into staging tables and then into target tables in the EDW. Also created mapplets to reuse in different mappings.
- Defined the schema, staging tables, and landing zone tables; configured base objects, foreign-key relationships, and complex joins; and built efficient views.
- Changing the existing Data Models using Erwin for Enhancements to the existing Datawarehouse projects.
- Implemented Pushdown Optimization (PDO) to resolve performance issues in complex mappings whose numerous transformations were degrading session performance.
- Expertise in writing scripts for Data Extraction, Transformation and Loading of data from legacy systems to target data warehouse using BTEQ, FastLoad, MultiLoad, and Tpump.
- Supported various business teams with Data Mining and Reporting by writing complex Teradata SQLs using Basic and Advanced SQL including OLAP functions like Ranking, partitioning and windowing functions.
- Conduct business requirement review sessions with IT and Business teams to understand what the outcome should be based on data elements.
- Developed scripts to load data into the base tables in the EDW, from source to staging, and from the staging area to target tables using the FastLoad, MultiLoad, and BTEQ utilities of Teradata. Wrote scripts for data cleansing, validation, and transformation of data coming from different source systems.
- Very good understanding of Database Skew, PPI, Join Methods and Join Strategies, Join Indexes including sparse, aggregate and hash.
- Extensively used the Teradata Analyst Pack, including Teradata Visual Explain, Teradata Index Wizard, and Teradata Statistics Wizard.
- Extensively used derived tables, volatile tables, and GTT tables in many of the ETL scripts.
- Tuned Teradata SQL statements using EXPLAIN, analyzing the data distribution among AMPs and index usage, collecting statistics, defining indexes, revising correlated subqueries, using hash functions, etc.
- Performed application-level DBA activities: creating tables and indexes, and monitoring and tuning Teradata BTEQ scripts using the Teradata Visual Explain utility.
- Wrote complex SQL using joins, subqueries, and correlated subqueries. Expertise in SQL queries for cross-verification of data.
- Developed Teradata macros and stored procedures to load data into incremental/staging tables and then move data from staging into base tables.
- Reviewed the SQL for missing joins and join constraints, data format issues, mismatched aliases, and casting errors.
- Extensively worked on Performance tuning to increase the throughput of the data load (like read the data from flat file & write the data into target flat files to identify the bottlenecks).
- Worked on Error handling and performance tuning in Teradata queries and utilities.
- Writing SQL Scripts to extract the data from Database and for Testing Purposes.
- Interacting with the Source Team and Business to get the Validation of the data.
- Involved in all enhancements using Informatica, PL/SQL, and SQL, delivering them within the time limit.
- Involved in resolving issue tickets related to data issues, ETL issues, performance issues, etc.
- Used UNIX shell scripting to automate several ETL processes.
- Created and Scheduled the Autosys process in the production by creating the JIL files.
- Strong expertise in physical modeling with knowledge to use Primary, Secondary, PPI, and Join Indexes.
- Created new UNIX scripts to automate and handle different file processing, editing, and execution sequences, using basic UNIX commands and the 'awk' and 'sed' editing languages.
- Created UNIX shell scripts for Informatica post and pre session operations, database administration and day-to-day activities like, monitor network connections and database ping utilities.
- Developed UNIX shell scripts to run batch jobs in Autosys and loads into production.
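The awk/sed-style pre-load file cleansing done in the shell scripts above can be sketched in Python; the three-field pipe-delimited feed layout is an assumption for illustration only:

```python
import io

# Cleanse a delimited feed before loading: trim whitespace from each field,
# drop rows with a wrong field count or empty fields, and emit a normalized
# record, with rejects kept aside for an error-handling file.
def cleanse_feed(lines, expected_fields=3):
    good, rejects = [], []
    for raw in lines:
        fields = [f.strip() for f in raw.rstrip("\n").split("|")]
        if len(fields) == expected_fields and all(fields):
            good.append("|".join(fields))
        else:
            rejects.append(raw.rstrip("\n"))
    return good, rejects

feed = io.StringIO("101| Acme |US\n102|Beta\n103|Gamma|CA\n")
good, rejects = cleanse_feed(feed)
```

In the shell version the same split/validate/reject flow would be an `awk -F'|'` one-liner writing to a load-ready file and a reject file.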
Environment: Informatica Power Center 9.6/9.1, Teradata 15/14, Control-M, Oracle 11g/10g, Fixed width files, TOAD, SQL Assistant, Unix, Korn Shell Scripting, Autosys r11.1, SAS, Cognos, Flat Files.
Confidential, Charlotte, NC
Sr. Informatica/Sr. Teradata Consultant
Responsibilities:
- Involved in full Software Development Life Cycle (SDLC) - Business Requirements Analysis, preparation of Technical Design documents, Data Analysis, Logical and Physical database design, Coding, Testing, Implementing, and deploying to business users.
- Providing technical support and guidance to the offshore team to address complex business problems.
- Involved in gathering business requirements, logical modeling, physical database design, data sourcing and data transformation, data loading, SQL and performance tuning.
- Defining the schema, staging tables, and landing zone tables, configuring base objects, foreign-key relationships, complex joins, and building efficient views.
- Expertise in writing scripts for Data Extraction, Transformation and Loading of data from legacy systems to target data warehouse using BTEQ, FastLoad, MultiLoad, and Tpump.
- Performed query optimization with the help of explain plans, collect statistics, and primary and secondary indexes. Used volatile tables and derived queries for breaking up complex queries into simpler ones. Streamlined the Teradata scripts and shell scripts migration process on the UNIX box.
- Worked on Informatica PowerCenter tools - Designer, Repository Manager, Workflow Manager, and Workflow Monitor.
- Using various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, Stored Procedure, and Union to develop robust mappings in the Informatica Designer.
- Developed new and modified existing mappings to meet new business requirements, loading into staging tables and then into target tables in the EDW. Also created mapplets to reuse in different mappings.
- Working on different tasks in Workflows like sessions, events raise, event wait, e-mail, command, worklets and scheduling of the workflow.
- Creating sessions, configuring workflows to extract data from various sources, transforming data, and loading into enterprise data warehouse.
- Ran and monitored daily scheduled jobs using Workload Manager to support EDW (Enterprise Data Warehouse) loads for history as well as incremental data.
- Design, Development and Documentation of the ETL (Extract, Transformation & Load) strategy to populate the Data Warehouse from the various source systems.
- Prepared data marts on policy data, policy coverage, claims data, client data and risk codes.
- Extensively used Informatica PowerCenter 8.6 to create and manipulate source definitions, target definitions, mappings, mapplets, transformations, re-usable transformations, etc.
- Involved in the design and development of complex ETL mappings and stored procedures in an optimized manner. Used PowerExchange for mainframe sources.
- Involved in loading data from source tables to the ODS (Operational Data Store) and XML files, using transformation and cleansing logic in Informatica.
- Based on the logic, used various transformations such as Source Qualifier, Normalizer, Expression, Filter, Router, Update Strategy, Sorter, Lookup, Aggregator, Joiner, XML, and Stored Procedure in the mappings.
- Involved in performance tuning of mappings, transformations and (workflow) sessions to optimize session performance.
- Developed Informatica SCD Type 1, Type 2, and Type 3 mappings and tuned them for better performance. Extensively used almost all Informatica transformations, including complex Lookups, Stored Procedures, Update Strategy, mapplets, and others.
- Snowflake Schema was mainly used with Geography, Customer, Product, and Time as basic dimensions.
- Creating Test cases for Unit Test, System, Integration Test and UAT to check the data quality.
- Investigating failed jobs and writing SQL to debug data load issues in Production.
- Writing SQL Scripts to extract the data from Database and for Testing Purposes.
- Interacting with the Source Team and Business to get the Validation of the data.
- Involved in Transferring the Processed files from mainframe to target system.
- Supported the code after postproduction deployment.
- Automated the ETL process using UNIX Shell scripting.
- Performed tuning and optimization of complex SQL queries using Teradata EXPLAIN and statistics collection.
- Created a BTEQ script for pre-population of the work tables prior to the main load process.
- Extensively used Derived Tables, Volatile Table and Global Temporary tables in many of the ETL scripts.
- Developed MLOAD scripts to load data from Load Ready Files to Teradata Warehouse.
- Performance Tuning of sources, Targets, mappings and SQL queries in transformations.
- Created Primary Indexes (PI) for both planned access of data and even distribution of data across all the available AMPS.
- Created appropriate Teradata NUSIs for smooth (fast and easy) access of data.
- Worked on exporting data to flat files using Teradata FastExport.
- Analyzed the data distribution and reviewed the index choices.
- In-depth expertise in the Teradata cost-based query optimizer; identified potential bottlenecks.
- Worked with Teradata PPI tables and was involved in Teradata-specific SQL fine-tuning to increase performance of the overall ETL process.
- Implemented project using Agile software methodologies (scrum).
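The primary-index distribution idea above (rows hash on the PI value, so a high-cardinality PI spreads rows evenly across AMPs while a low-cardinality one skews) can be sketched as follows; the hash function and AMP count are illustrative, not Teradata's actual row-hash algorithm:

```python
from collections import Counter
import zlib

# Model row placement: each row lands on the AMP selected by hashing its
# primary-index value. CRC32 stands in for Teradata's row hash.
def amp_distribution(pi_values, n_amps=4):
    return Counter(zlib.crc32(str(v).encode()) % n_amps for v in pi_values)

# Skew factor: rows on the busiest AMP divided by the per-AMP average.
# 1.0 is perfectly even; n_amps means everything piled onto one AMP.
def skew_factor(dist, n_amps=4):
    return max(dist.values()) / (sum(dist.values()) / n_amps)

even = amp_distribution(range(10_000))       # unique surrogate keys as PI
skewed = amp_distribution(["US"] * 10_000)   # one repeated PI value
```

This is why choosing a PI is a trade-off between planned access paths and even distribution: a column queried constantly but holding few distinct values makes a poor PI.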
Environment: Informatica Power Center 9.1/8.6, Teradata 14/13.10, Unix, Oracle 10g, Control-M, Fixed width files, TOAD, SQL Assistant, Unix Shell Scripting, Erwin 7.3, Autosys, ETL/ELT.
Confidential, St. Louis, MO
ETL Developer
Responsibilities:
- Interacted with business team to understand business needs and to gather requirements.
- Prepared requirements document in order to achieve business goals and to meet end user expectations.
- Created Mapping document from Source to stage and Stage to target mapping.
- Involved in creating data models using Erwin.
- Worked with Designer tools like Source Analyzer, Target designer, Mapping designer, Mapplets designer, Transformation Developer.
- Worked Extensively on Teradata SQL Assistant to analyze the existing data and implemented new business rules to handle various source data anomalies.
- Designed Mappings by including the logic of restart.
- Created Source and Target Definitions, Reusable transformations, Mapplets and Worklets.
- Created Mappings and used transformations like Source Qualifier, Filter, Update Strategy, Lookup, Router, Joiner, Normalizer, Aggregator, Sequence Generator and Address validator.
- Involved in tuning the mappings, sessions and the Source Qualifier query.
- Identified performance issues in existing sources, targets and mappings by analyzing the data flow, evaluating transformations and tuned accordingly for better performance.
- Managed all technical aspects of the ETL mapping process with other team members.
- Developed mappings to load Fact and Dimension tables, SCD Type 1 and SCD Type 2 dimensions and Incremental loading and unit tested the mappings.
- Performed Unit testing and created Unix Shell Scripts and provided on call support.
- Created sessions and workflows to run with the logic embedded in the mappings
- Actively participated in Scrum Meetings.
- Transformed the logical data model into physical data model on Oracle 11g data warehouse environment to give optimal performance for ETL jobs as well as BI reports.
- Worked on Teradata, Oracle 11g, and AS/400 databases.
- Developed and executed load scripts using the Teradata client utilities MultiLoad, FastLoad, and BTEQ.
- Provided the dimensional data model to give optimal performance by changing the Primary Index of tables and applying various performance tuning techniques on Oracle 11g data warehouse environment.
- Created and modified several UNIX shell Scripts according to the changing needs of the project and client requirements.
- Wrote Unix Shell Scripts to process the data received from source system on daily basis.
- Involved in the continuous enhancements and fixing of production problems.
- Created and reviewed scripts to create new tables, views, queries for new enhancement in the applications using TOAD.
- Developed and modified UNIX shell scripts as part of the ETL process.
- Participated in discussions with the team members on Prod issues and resolved.
- Monitored data quality and generated weekly/monthly/yearly statistics reports on production processes (success/failure rates) for causal analysis as part of maintenance, and enhanced existing production ETL scripts.
- Managed Scheduling of Tasks to run any time without any operator intervention.
- Leveraged workflow manager for session management, database connection management and scheduling of jobs.
- Implemented Pushdown Optimization (PDO) to resolve performance issues in complex mappings whose numerous transformations were degrading session performance.
- Worked with Pre and Post Session SQL commands to drop and recreate the indexes on data warehouse using Source Qualifier Transformation of Informatica Power center.
- Created Unix Shell Scripts to automate sessions and cleansing the source data.
- Implemented pipeline partitioning concepts like hash-key, round-robin, key-range, and pass-through techniques in mapping transformations.
- Used Autosys for scheduling and created various UNIX shell scripts for scheduling data cleansing scripts and the loading process. Maintained the batch processes using UNIX shell scripts.
- Wrote, executed, performance tuned SQL Queries for Data Analysis & Profiling.
- Provided Production Support.
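The pipeline-partitioning schemes named above can be sketched as follows; the partition counts and range bounds are illustrative:

```python
# Three ways to split a row stream across parallel pipeline partitions.

def hash_key(rows, key, n):
    # Hash-key: rows with the same key always land in the same partition,
    # which keeps key-grouped operations (aggregation, joins) correct.
    parts = [[] for _ in range(n)]
    for r in rows:
        parts[hash(r[key]) % n].append(r)
    return parts

def round_robin(rows, n):
    # Round-robin: perfectly even load, no key affinity.
    parts = [[] for _ in range(n)]
    for i, r in enumerate(rows):
        parts[i % n].append(r)
    return parts

def key_range(rows, key, bounds):
    # Key-range with bounds [2, 4]: partition 0 holds key < 2,
    # partition 1 holds key < 4, and the last partition takes the rest.
    parts = [[] for _ in range(len(bounds) + 1)]
    for r in rows:
        for i, b in enumerate(bounds):
            if r[key] < b:
                parts[i].append(r)
                break
        else:
            parts[-1].append(r)
    return parts

rows = [{"id": i} for i in range(6)]
rr = round_robin(rows, 3)
kr = key_range(rows, "id", [2, 4])
```

Pass-through, the fourth scheme, simply keeps rows in whatever partition they already occupy, so it needs no routing function at all.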
Environment: Informatica Power Center 9.1, Teradata 13/12, Unix, Oracle 10g, SQL Assistant, TOAD, Unix Shell Scripting, Control-M
Confidential, Atlanta, GA
Teradata/Informatica ETL Developer
Responsibilities:
- Analyzing, designing and developing ETL strategies and processes, writing ETL specifications, Informatica development, and administration and mentoring other team members.
- Developed mapping parameters and variables to support SQL override.
- Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, SQL, and Lookup (file and database) to develop robust mappings in the Informatica Designer.
- Worked on Teradata and its utilities (TPump, FastLoad) through Informatica. Also created complex Teradata macros.
- Implemented Pushdown Optimization (PDO) to resolve performance issues in complex mappings whose numerous transformations were degrading session performance.
- Involved in performance tuning at source, target, mapping, session, and system levels.
- Moved data from source systems to different schemas based on the dimension and fact tables, using Slowly Changing Dimensions Type 1 and Type 2.
- Mostly worked on Dimensional Data Modeling, Star Schema, and Snowflake Schema modeling.
- Worked on the various enhancements activities, involved in process improvement.
- Used Informatica client tools - Source Analyzer, Warehouse designer, Mapping designer, Transformation Developer, WorkFlow Manager, Workflow Monitor.
- Worked on Change data Capture (CDC) using CHKSUM to handle any change in the data if there is no flag or date column present to represent the changed row.
- Worked on reusable code known as tie-outs to maintain data consistency; it compares the source and target after the ETL load completes to validate that no data was lost during the ETL process.
- Worked on the PowerExchange bulk data movement process using the PowerExchange Change Data Capture (CDC) method, PowerExchange Navigator, and PowerExchange Bulk Data Movement. PowerExchange CDC can retrieve updates at user-defined intervals or in near real time.
- Worked independently on the critical milestone of the project interfaces by designing completely parameterized code to be used across the interfaces, and delivered on time in spite of several hurdles such as requirement changes, business rule changes, source data issues, and complex business functionality.
- Analyzed the source systems to detect the data patterns and designed ETL strategy to process the data.
- Created and modified several UNIX shell Scripts according to the changing needs of the project and client requirements.
- Wrote Unix Shell Scripts to process the data received from source system on daily basis.
- Involved in the continuous enhancements and fixing of production problems.
- Defining the schema, staging tables, and landing zone tables, configuring base objects, foreign-key relationships, complex joins, and building efficient views.
- Expertise in writing scripts for Data Extraction, Transformation and Loading of data from legacy systems to target data warehouse using BTEQ, FastLoad, MultiLoad, and Tpump.
- Supported the development and production support group in identifying and resolving production issues.
- Was instrumental in understanding the functional specifications and helped the rest of the team to understand the same.
- Involved in All the phases of the development like Analysis, Design, Coding, Unit Testing, System Testing and UAT
- Created and executed test cases, test scripts and test summary reports
- Provided support and training to the team regarding technical and functional aspects of the project.
- Implemented the DBMOVER configuration file, dbmover.cfg, to configure the operation of various PowerExchange tasks as well as their communication with other PowerExchange tasks.
- Applied different types of monitoring for PowerExchange processes, such as dtlcacon, dtllst, Oracle SCN, lag scripts, and a heartbeat table to monitor lag.
- Created and managed different Power exchange directories like condense files directory, check point directory etc.
- Developed wrapper shell scripts for calling Informatica workflows using PMCMD command and Created shell scripts to fine tune the ETL flow of the Informatica workflows.
- Implemented Teradata MERGE statements in order to update huge tables thereby improving the performance of the application.
- Used reusable transformation objects such as mapplets to avoid duplication of metadata, reducing the development time.
- Migrated the code into QA (Testing) and supported QA team and UAT (User).
- Working with Power Center Versioning (Check-in, Check-out), Querying to retrieve specific objects, maintaining the history of objects.
- Created mappings using transformations like Source Qualifier, Joiner, Aggregator, Expression, Filter, Router, Lookup, Update Strategy, and Sequence Generator.
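The CHKSUM-based change data capture described above (for sources with no flag or last-modified column) can be sketched in Python; MD5 stands in for Teradata's CHKSUM and the column layout is illustrative:

```python
import hashlib

# Hash each row's non-key columns; a row whose checksum differs from the
# previous load is an update, and a key not seen before is an insert.
def row_checksum(row, key_cols=("id",)):
    payload = "|".join(str(v) for k, v in sorted(row.items()) if k not in key_cols)
    return hashlib.md5(payload.encode()).hexdigest()

def detect_changes(prev_sums, current_rows):
    inserts, updates = [], []
    for row in current_rows:
        chk = row_checksum(row)
        if row["id"] not in prev_sums:
            inserts.append(row["id"])
        elif prev_sums[row["id"]] != chk:
            updates.append(row["id"])
    return inserts, updates

prev = {1: row_checksum({"id": 1, "amt": 10}),
        2: row_checksum({"id": 2, "amt": 20})}
ins, upd = detect_changes(prev, [{"id": 1, "amt": 10},
                                 {"id": 2, "amt": 25},
                                 {"id": 3, "amt": 30}])
```

Storing only the per-key checksum from the prior load keeps the comparison cheap: unchanged rows (like id 1 here) are skipped without comparing every column.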
Environment: Informatica 9.1 (Designer, Repository Manager, Workflow Manager, Workflow Monitor), Informatica 8.x, Oracle 10g, Teradata 13/13.10, UNIX, Citrix, Toad, Putty, Developer, PowerExchange 9.2.1/8.6.1