
Sr. DataStage Developer Resume

Glen Allen, VA

SUMMARY

  • 14+ years of IT experience in software development, design, and implementation, with a major focus on data warehousing and database applications.
  • 12+ years of strong data warehousing experience using the ETL tools DataStage, Informatica, and Ab Initio.
  • Proficient with relational database models, stored procedures, and views.
  • 8+ years of experience in advanced SQL and PL/SQL programming for pulling data from various databases.
  • 6+ years of experience writing Korn shell scripts to run jobs sequentially and to send data to other servers using FTP commands (see the sketch at the end of this list).
  • 6 years of experience in development and testing using IBM WebSphere DataStage 9.x/8.x.
  • 3+ years of experience in development and testing using Ab Initio.
  • 3+ years of experience with Informatica Data Quality (IDQ).
  • 2+ years of experience in Business Objects, using the Information Design Tool to publish BI reports.
  • Extensive experience in Korn shell scripting.
  • Data processing experience in designing and implementing data mart applications, mainly transformation processes, using the ETL tool DataStage 9.x/8.x; designed and developed jobs using DataStage Designer, Manager, Director, and Debugger.
  • Efficient in all phases of the development lifecycle, including data cleansing, data conversion, performance tuning, and system testing.
  • Proven experience in data modeling, data management, data warehousing, data transformation, metadata and reference data management, and Erwin tools.
  • Skilled in designing, developing, implementing, and maintaining applications in risk management, fraud control, and online bill payment. Extensive hands-on experience with big data technologies and the Hadoop stack, including loading data from Oracle to Hadoop.
  • Extensive experience with UNIX, Linux, and SQL, and in production support for applications.
  • Created comprehensive application support documentation (runbooks) for use by L2 and L3 support.
  • Tracked production issues using ServiceNow incidents and tasks.
  • Hands-on data experience with Netezza, SAP HANA, Oracle, DB2, SQL Server, and PostgreSQL.
  • Skilled in Informatica application support, including Informatica Data Quality and Data Transformation.
  • Skilled in Ab Initio application support, including Ab Initio Express>IT (BRE).
  • Working knowledge of Java/J2EE and SAS.
  • Worked on various Hadoop distributions (Cloudera, Hortonworks, Amazon AWS). Extensive experience using JIRA and RTC to track task status online under an Agile methodology.
  • Working experience preparing BI reports using Informatica, Crystal Reports, and Business Objects based on client reporting requirements.
  • Proven experience using Korn shell scripting to maximize Ab Initio parallelism; developed numerous Ab Initio graphs using data parallelism and Multi File System (MFS) techniques.
  • Experience in designing, developing, and testing applications using Ab Initio software.
  • Worked with continuous components and XML components in Ab Initio.
  • Experience working with various heterogeneous source systems such as Netezza, SAP HANA, Oracle, SQL Server, Teradata, DB2 UDB, and flat files.
  • Good working knowledge of relational database utilities such as TOAD, SQL Assistant, and SQL*Loader.
  • Strong knowledge of batch job automation tools, including AutoSys and Control-M.
  • Sound knowledge of data warehouse concepts, star and snowflake schema designs, and dimensional modeling.
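
A minimal sketch of the kind of Korn shell wrapper described above, running two jobs sequentially and then pushing the output over FTP. The host, credentials, directory, and job wrapper names are hypothetical placeholders, not values from any actual engagement.

    #!/bin/ksh
    # Run two job steps in order; abort on the first failure.
    set -e
    DATA_DIR=/data/outbound                  # hypothetical staging directory
    OUT_FILE=claims_$(date +%Y%m%d).dat

    run_extract.ksh "$DATA_DIR/$OUT_FILE"    # hypothetical job wrappers
    run_load.ksh "$DATA_DIR/$OUT_FILE"

    # Send the result to the downstream server using FTP commands.
    ftp -n downstream.example.com <<EOF
    user ftpuser ftppass
    binary
    lcd $DATA_DIR
    cd /inbound
    put $OUT_FILE
    bye
    EOF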

TECHNICAL SKILLS

Operating Systems: DOS, UNIX, Linux, Windows 95/98/2000/NT 4.0/XP

Database Management Systems: Oracle 8i/9i, SQL Server 2000/2005, MS Access, DB2, Teradata

Programming Languages: C, C++, JavaScript, COBOL, SQL, PL/SQL, Korn shell

GUI and Web Technologies: Visual Basic 5.0/6.0, VB.NET, HTML, XML, SAP BI BO 4.x

ETL Tools: IBM WebSphere DataStage 9.x/8.x, Informatica 8.5/9.x, IDQ 9.x, Ab Initio GDE 1.14/1.15.6, Co>Operating System 2.13/2.14

PROFESSIONAL EXPERIENCE

Confidential, Glen Allen, VA

Sr. DataStage Developer

Responsibilities:

  • Involved in preparing the technical specifications and the source-to-target mapping document based on the functional specifications document.
  • Worked with the source system team (product processors) to understand and standardize the data layout.
  • Analyzed the existing informational sources and methods to identify problem areas and make recommendations for improvement.
  • Helped in preparing the source-to-target mapping document.
  • Worked with DataStage Manager for importing metadata from the repository, creating new job categories, and creating new data elements.
  • Used DataStage stages such as Hashed File, Sequential File, Transformer, Aggregator, Sort, Data Set, Join, Lookup, Change Capture, Funnel, Peek, Copy, Merge, and the DB2, Oracle, and Netezza connectors in the ETL coding.
  • Imported data from the Teradata database to HDFS with Sqoop, using the specified compression codecs (see the sketch at the end of this list).
  • Involved in enhancements and maintenance activities of the data warehouse, including tuning and modifying stored procedures for code enhancements.
  • Successfully loaded files to HDFS from Oracle, and loaded from HDFS to Netezza.
  • Used the debugger to identify bugs in existing mappings by analyzing data flow and evaluating transformations.
  • Pre- and post-session assignment variables were used to pass variable values from one session to another.
  • Reviewed and analyzed functional requirements and mapping documents; involved in problem solving and troubleshooting.
  • Performed unit testing at various levels of the ETL and actively participated in team code reviews.
  • Identified problems in existing production data and developed one-time scripts to correct them.
  • Fixed invalid mappings and troubleshot technical problems in the database.
  • Prepared unit test cases and the implementation plan, and handled code migration to the next layers.
  • Used Business Objects for reporting purposes.
  • Used the Control-M job scheduler to automate the daily and monthly runs of the DW cycle in both production and UAT environments.
  • Wrote shell scripts (wrapper scripts) to handle data before executing sessions and to meet prerequisites for running command sessions.
  • Provided 24x7 production support for ETL jobs on daily, weekly, and monthly schedules.
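
A hedged sketch of the kind of Sqoop import used for the Teradata-to-HDFS loads above. The JDBC URL, credentials, table, and target directory are illustrative assumptions, and the Teradata JDBC driver is assumed to be on the Sqoop classpath.

    #!/bin/ksh
    # Import one Teradata table into HDFS, compressing the output with Snappy.
    sqoop import \
        --connect jdbc:teradata://td-prod.example.com/DATABASE=edw \
        --driver com.teradata.jdbc.TeraDriver \
        --username etl_user \
        --password-file /user/etl/.td_pass \
        --table CLAIMS_DAILY \
        --target-dir /data/raw/claims_daily \
        --compress \
        --compression-codec org.apache.hadoop.io.compress.SnappyCodec \
        --num-mappers 8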

Environment: DataStage 9.1, Oracle 12c, Hadoop, HDFS, Netezza, DB2, SAP BI BusinessObjects 4.x, Aginity, TOAD, SQuirreL SQL, Control-M, Rational Team Concert (RTC), PuTTY, Linux, WinSCP, Windows XP

Confidential, Tampa, FL

Sr. DataStage Developer

Responsibilities:

  • Involved in source to target data field mapping discussions.
  • Collaborated with business users, business analysts and project managers to translate the business requirements into technical specifications.
  • Involved in analysis of the physical data model for the ETL mapping and the process flow diagrams.
  • Responsible for creating High-Level Design (HLD) and Low-Level/Application Interface Design (LLD) documents.
  • Responsible for designing the schema, with a special focus on user interface design.
  • Responsible for conceptualizing and establishing the need for data warehousing solutions as applicable to billing systems.
  • Evaluated the consistency and integrity of the model and repository.
  • Used DataStage Parallel Extender for parallel processing of data extraction and transformation.
  • Used Integrity and Parallel Extender for data cleansing and performance improvement.
  • Extensively used almost all of the DataStage transforms for various types of date conversions.
  • Created partitioned primary indexes and secondary indexes for query performance in DB2 (see the index sketch at the end of this list).
  • Responsible for tuning the DataStage repository and jobs for optimum performance.
  • Extensively used Integrity's existing wizards to remove duplicates.
  • Parsed, matched, and removed duplicate records using Integrity's built-in wizards.
  • Scheduled and monitored automated weekly jobs.
  • Performed unit testing of individual modules and their integration testing.
  • Debugged and sorted out the errors and problems encountered in the production environment.
  • Performed daily production support of the production data warehouse.
  • Wrote shell scripts to run workflows in the UNIX environment.
  • Optimized performance tuning at the source, target, mapping, and session levels.
  • Used Business Objects for the reporting purposes.
  • Participated in weekly status meetings, conducted internal and external reviews and formal walkthroughs among various teams, and documented the proceedings.
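
A small sketch of the DB2 index work mentioned above, run through the DB2 command line processor. The database, table, and column names are examples, and the PARTITIONED clause assumes a range-partitioned table on DB2 9.7 or later.

    #!/bin/ksh
    # Create a partitioned index and a secondary index on a range-partitioned table.
    db2 connect to EDWDB user etl_user using "$DB2_PASS"

    # Partitioned index on the fact table's date key, for partition-wise scans.
    db2 "CREATE INDEX IX_SALES_FACT_DT ON SALES_FACT (SALE_DATE) PARTITIONED"

    # Non-partitioned secondary index to support customer lookups.
    db2 "CREATE INDEX IX_SALES_FACT_CUST ON SALES_FACT (CUSTOMER_ID) NOT PARTITIONED"

    db2 connect reset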

Environment: IBM InfoSphere DataStage 9.x, Oracle, Oracle SQL Developer, SAP BI Business Objects 4.x, Teradata, Control-M, Linux, WinSCP, PuTTY, Windows XP

Confidential, Frederick, MD

Sr. DataStage Developer/Analyst

Responsibilities:

  • Involved in source to target data field mapping discussions.
  • Worked with the source system team (product processors) to understand and standardize the data layout.
  • Worked with other developers and business analysts to create a best-in-class ETL solution.
  • Involved as the primary on-site ETL developer during the analysis, planning, design, development, and implementation stages of projects using IBM InfoSphere DataStage 9.x.
  • Prepared data mapping documents and designed the ETL jobs based on the DMD with the required tables in the dev environment.
  • Used DataStage as an ETL tool to extract data from source systems and load it into the Oracle database and HDFS.
  • Designed and developed DataStage jobs to extract data from heterogeneous sources, applied transformation logic to the extracted data, and loaded it into the data warehouse databases.
  • Created DataStage jobs using different stages such as Transformer, Aggregator, Sort, Join, Merge, Lookup, Data Set, Funnel, Remove Duplicates, Copy, Modify, Filter, Change Data Capture, Change Apply, Sample, Surrogate Key, Column Generator, and Row Generator.
  • Extensively worked with Join, Look up (Normal and Sparse) and Merge stages.
  • Extensively worked with sequential file, dataset, file set and look up file set stages.
  • Extensively used Parallel Stages like Row Generator, Column Generator, Head, and Peek for development and de-bugging purposes.
  • Used the DataStage Director and its run-time engine to schedule running the solution, test and debug its components, and monitor the resulting executable versions on an ad hoc or scheduled basis.
  • Developed complex stored procedures using input/output parameters, cursors, views, and triggers, and complex queries using temp tables and joins.
  • Converted complex job designs into different job segments executed through the job sequencer for better performance and easier maintenance.
  • Created job sequences.
  • Maintained the data warehouse by loading dimensions and facts as part of the project; also worked on different enhancements to FACT tables.
  • Created a shell script to run DataStage jobs from UNIX and scheduled it through the scheduling tool (see the dsjob sketch at the end of this list).
  • Coordinated with team members and administered all onsite and offshore work packages.
  • Analyzed performance and monitored workloads for capacity planning.
  • Performed performance tuning of the jobs by interpreting performance statistics of the jobs developed.
  • Documented ETL test plans, test cases, test scripts, and validations based on design specifications for unit, system, and functional testing; prepared test data for testing, error handling, and analysis.
  • Participated in weekly status meetings.
  • Developed Test Plan that included the scope of the release, entrance and exit criteria and overall test strategy. Created detailed Test Cases and Test sets and executed them manually.
  • Involved in enhancements and maintenance activities of the data warehouse, including tuning and modifying stored procedures for code enhancements.
  • Used the debugger to identify bugs in existing jobs by analyzing data flow and evaluating transformations.
  • Performed unit testing at various levels of the ETL and actively participated in team code reviews.
  • Identified problems in existing production data and developed one-time scripts to correct them.
  • Fixed invalid mappings and troubleshot technical problems in the database.
  • Used Business Objects (IDT) for reporting purposes.
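
A minimal sketch of the UNIX wrapper described above, assuming a default DataStage engine install path; the project and job names are placeholders. With -jobstatus, dsjob's exit code reflects the job's finishing status (1 = OK, 2 = finished with warnings).

    #!/bin/ksh
    # Run a DataStage job from UNIX and report its final status.
    . /opt/IBM/InformationServer/Server/DSEngine/dsenv    # engine environment

    PROJECT=DW_PROJECT          # placeholder project
    JOB=seq_load_claims         # placeholder job sequence

    dsjob -run -wait -jobstatus $PROJECT $JOB
    status=$?

    if [[ $status -eq 1 || $status -eq 2 ]]; then
        echo "$JOB completed (status $status)"
    else
        echo "$JOB failed (status $status)" >&2
        dsjob -logsum -type FATAL $PROJECT $JOB
        exit 1
    fi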

Environment: IBM InfoSphere DataStage 9.x, Oracle, DB2, Netezza, Hadoop, HDFS, SAP BI BusinessObjects 4.x, JIRA, legacy systems, Teradata, TOAD, PuTTY, WinSCP, Windows XP

Confidential, Franklin Lake, NJ

Sr. Informatica Developer/Analyst

Responsibilities:

  • Involved in writing of detailed design document and Test cases preparation.
  • Closely interacted with the Business Analyst team for feed mapping documents.
  • Imported data from the Oracle database to HDFS with Sqoop, using the specified compression codecs.
  • Involved in developing the mappings, mapplets, transformations, sessions, tasks, and workflows for the Coverage and Claims Compare applications.
  • Involved in all stages of the SDLC during the project. Analyzed, designed, and tested the new system for performance, efficiency, and maintainability using Informatica ETL performance techniques.
  • Extracted West claims-related files, translated them into WHSEFUL format, and compared them with the East claims-related WHSEFUL files.
  • Developed source-to-staging and staging-to-EDW ETL mappings, integrating DB2 and Oracle database data into the Netezza enterprise data warehouse.
  • Designed and developed the mappings using Informatica PowerCenter, with Sorter, Normalizer, Router, Update Strategy, and Sequence Generator components.
  • Involved in data integration, combining data from several data sources.
  • Involved in data migration from one repository to another based on the migration requirements process, and in transferring data from one system to another while changing the storage, database, or application using Informatica ETL.
  • Acted as the liaison between the development team and business users to identify defects and to fix and deploy enhancements and bug fixes in the production environment.
  • Involved in writing SQL join queries for data comparison.
  • Involved in unit testing and system testing of the Coverage application.
  • Involved in comparing claims between the West and East WHSEFUL files.
  • Wrote several UNIX control scripts, specific to the application, to pass environment variables and send files to a shared drive.
  • Inserted data into the Teradata data warehouse using the FastLoad, MultiLoad, and BTEQ utilities (see the BTEQ sketch at the end of this list).
  • Created comprehensive application support documentation (runbooks) for use by L2 and L3 support.
  • Coordinated production code and infrastructure changes using Remedy Change Management.
  • Used Business Objects for reporting purposes.
  • Tracked production issues using Remedy incidents and tasks.
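
An illustrative BTEQ load of a delimited staging file into Teradata; the TDPID, credentials, file, and table names are assumptions. With a VARTEXT import, every USING field is declared VARCHAR and cast by the INSERT.

    #!/bin/ksh
    # Load a pipe-delimited extract into a Teradata staging table via BTEQ.
    bteq <<EOF
    .LOGON tdprod/etl_user,etl_password;
    .IMPORT VARTEXT '|' FILE = /data/stage/claims.txt
    .QUIET ON
    .REPEAT *
    USING (claim_id VARCHAR(18), claim_amt VARCHAR(18))
    INSERT INTO edw_stg.claims_stg (claim_id, claim_amt)
    VALUES (:claim_id, :claim_amt);
    .LOGOFF
    .QUIT
    EOF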

Environment: Informatica 9.0, Oracle, PostgreSQL, UDB DB2, Teradata, SAP BI BusinessObjects 4.x, TOAD, JIRA, UNIX, WinSCP, PuTTY, Windows XP

Confidential, Herndon, VA

Sr. Programmer Analyst

Responsibilities:

  • Closely interacted with the Business Analyst team for feed mapping documents.
  • Involved in all phases of the SDLC, from requirements gathering through design, development, testing, production, user training, and support for the production environment.
  • Created new mapping designs using various tools in Informatica Designer, such as Source Analyzer, Warehouse Designer, Mapplet Designer, and Mapping Designer.
  • Developed the mappings using the needed transformations in Informatica according to the technical specifications.
  • Created complex mappings that involved implementing business logic to load data into the staging area.
  • Used Informatica reusability at various levels of development.
  • Developed mappings/sessions using Informatica Power Center 9.0 for data loading.
  • Performed data manipulations using various Informatica Transformations like Filter, Expression, Lookup (Connected and Un-Connected), Aggregate, Update Strategy, Normalizer, Joiner, Router, Sorter and Union.
  • Developed Workflows using task developer, Worklet designer and workflow designer in Workflow manager and monitored the results using workflow monitor.
  • Built reports according to user requirements.
  • Extracted data from Oracle and SQL Server, then used Teradata for data warehousing.
  • Implemented slowly changing dimension methodology for accessing the full history of accounts.
  • Wrote shell scripts to run workflows in the UNIX environment (see the pmcmd sketch at the end of this list).
  • Optimized performance tuning at the source, target, mapping, and session levels.
  • Used Business Objects for reporting purposes.
  • Participated in weekly status meetings, conducted internal and external reviews and formal walkthroughs among various teams, and documented the proceedings.
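
A minimal sketch of the kind of UNIX wrapper used to run a workflow via PowerCenter's pmcmd; the integration service, domain, folder, and workflow names are placeholders.

    #!/bin/ksh
    # Start an Informatica workflow and wait for it to finish.
    pmcmd startworkflow \
        -sv IS_PROD -d Domain_PROD \
        -u etl_user -p "$INFA_PASS" \
        -f DW_FOLDER -wait wf_load_accounts

    if [[ $? -ne 0 ]]; then
        echo "wf_load_accounts failed" >&2
        exit 1
    fi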

Environment: Informatica 9.0, Oracle, UDB DB2, SAP BI BusinessObjects 4.x, PostgreSQL, Netezza, TOAD, Linux, PuTTY, SQL/PL-SQL, Windows 2000/XP, Remedy

Confidential, Warren, NJ

Sr. DataStage Developer

Responsibilities:

  • Closely interacted with the outbound feeds team for feed mapping documents.
  • Actively participated in decision making and QA meetings, and regularly interacted with the business analysts and development team to gain a better understanding of the business process, requirements, and design.
  • Involved in all stages of the SDLC during the project. Analyzed, designed, and tested the new system for performance, efficiency, and maintainability using the ETL tool IBM WebSphere DataStage.
  • Used DataStage as an ETL tool to extract data from source systems and load it into the Oracle database.
  • Designed and developed DataStage jobs to extract data from heterogeneous sources, applied transformation logic to the extracted data, and loaded it into the data warehouse databases.
  • Created DataStage jobs using different stages such as Transformer, Aggregator, Sort, Join, Merge, Lookup, Data Set, Funnel, Remove Duplicates, Copy, Modify, Filter, Change Data Capture, Change Apply, Sample, Surrogate Key, Column Generator, and Row Generator.
  • Extensively worked with Join, Look up (Normal and Sparse) and Merge stages.
  • Extensively worked with sequential file, dataset, file set and look up file set stages.
  • Extensively used Parallel Stages like Row Generator, Column Generator, Head, and Peek for development and de-bugging purposes.
  • Used the DataStage Director and its run-time engine to schedule running the solution, test and debug its components, and monitor the resulting executable versions on an ad hoc or scheduled basis.
  • Developed complex stored procedures using input/output parameters, cursors, views, and triggers, and complex queries using temp tables and joins.
  • Converted complex job designs into different job segments executed through the job sequencer for better performance and easier maintenance.
  • Developed complex jobs using various stages such as Lookup, Join, Transformer, Data Set, Row Generator, Column Generator, Sequential File, Aggregator, and Modify.
  • Created queries using join and case statement to validate data in different databases.
  • Created queries to compare data between two databases to make sure data is matched.
  • Used the DataStage Director for debugging components and monitoring the resulting executable versions on an ad hoc or scheduled basis.
  • Monitored the DataStage jobs daily by running the UNIX shell script and force-started them whenever a job failed.
  • Created and modified batch scripts to FTP files from different servers to the DataStage server (see the FTP sketch at the end of this list).
  • Extensively used the slowly changing dimension Type 2 approach to maintain history in the database.
  • Involved in unit testing and the resolution of various bottlenecks encountered.
  • Implemented various performance tuning techniques.
  • Involved in writing SQL join queries to implement business rules in the Source Qualifier, for performance improvement, and for data analysis.
  • Involved in production implementation best practices.
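
An illustrative batch script for the FTP transfers described above; the remote host, credentials, and directories are placeholder values.

    #!/bin/ksh
    # Pull the day's feed file from a remote server onto the DataStage server.
    REMOTE=feeds.example.com
    FILE=outbound_$(date +%Y%m%d).dat

    ftp -n $REMOTE <<EOF
    user ftpuser ftppass
    binary
    cd /export/feeds
    lcd /data/dstage/inbound
    get $FILE
    bye
    EOF

    # Fail the batch if the file did not arrive or is empty.
    [[ -s /data/dstage/inbound/$FILE ]] || { echo "FTP of $FILE failed" >&2; exit 1; }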

Environment: IBM WebSphere DataStage 9.x, AIX, Oracle, Netezza, Teradata, TOAD, SQuirreL SQL, Rational Team Concert (RTC), PuTTY, SQL/PL-SQL, Windows XP

Confidential, Warren, NJ

Sr. Informatica Developer/Analyst

Responsibilities:

  • Fund Service Reporting (FSR) contains three modules: Extracts, MultiFunds, and Loads.
  • Developed high-level and low-level design documents for processing each fund extract, and documented the various implementations made in each branch of the FSR application.
  • Developed the ETL data flow process for the FSR project using Microsoft Visual Studio.
  • Involved in source-to-target data field mapping discussions and production support.
  • Developed ETL programs using Informatica to implement the business requirements.
  • Communicated with business customers to discuss the issues and requirements.
  • Created shell scripts to fine tune the ETL flow of the Informatica workflows.
  • Used Informatica file watch events to poll the FTP sites for the external mainframe files.
  • Production Support has been done to resolve the ongoing issues and troubleshoot the problems.
  • Performance tuning was done at the functional level and the map level. Used relational SQL wherever possible to minimize data transfer over the network.
  • Effectively used Informatica parameter files for defining mapping variables, workflow variables, FTP connections, and relational connections (see the parameter-file sketch at the end of this list).
  • Involved in enhancements and maintenance activities of the data warehouse, including tuning and modifying stored procedures for code enhancements.
  • Effectively worked in Informatica version based environment and used deployment groups to migrate the objects.
  • Used the debugger to identify bugs in existing mappings by analyzing data flow and evaluating transformations.
  • Effectively worked in an onsite/offshore work model.
  • Pre- and post-session assignment variables were used to pass variable values from one session to another.
  • Designed workflows with many sessions using decision, assignment, event-wait, and event-raise tasks, and used the Informatica scheduler to schedule jobs.
  • Reviewed and analyzed functional requirements and mapping documents; involved in problem solving and troubleshooting.
  • Performed unit testing at various levels of the ETL and actively participated in team code reviews.
  • Identified problems in existing production data and developed one-time scripts to correct them.
  • Fixed invalid mappings and troubleshot technical problems in the database.
  • Involved in writing SQL queries to migrate extracts from DEV to UAT.
  • Involved in writing SQL join queries to determine whether funds were loaded into the database.
  • Wrote several UNIX control scripts, specific to the application, to pass files to another server.
  • Responsible for extracting data (daily, weekly, monthly) from the database and maintaining historical data in the database for BI reports.
  • Inserted data into the Teradata data warehouse using the FastLoad, MultiLoad, and BTEQ utilities.
  • Phasing and compute points were used in the graphs to avoid deadlocks, control the subscribers' messages, and commit changes to the database at every compute point.
  • Used Business Objects for reporting purposes.
  • Involved in production migration and in production support on a scheduled basis.
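
A hedged sketch of generating a run-specific parameter file and passing it to pmcmd; the folder, workflow, and variable names are illustrative, not the application's actual parameters.

    #!/bin/ksh
    # Build today's parameter file, then hand it to the workflow run.
    PARAM_FILE=/data/infa/param/wf_fsr_loads_$(date +%Y%m%d).parm

    cat > $PARAM_FILE <<EOF
    [DW_FOLDER.WF:wf_fsr_loads]
    \$\$RUN_DATE=$(date +%Y-%m-%d)
    \$\$SRC_FILE_DIR=/data/fsr/extracts
    \$DBConnection_Target=EDW_ORA
    EOF

    pmcmd startworkflow -sv IS_PROD -d Domain_PROD -u etl_user -p "$INFA_PASS" \
        -f DW_FOLDER -paramfile $PARAM_FILE -wait wf_fsr_loads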

Environment: Informatica 8.5, Oracle 10g, UDB DB2, QTODBC, SQLDBX, PuTTY, SQL/PL-SQL, UNIX, shell scripts, WinSCP, XML

Confidential, Atlanta, GA

DataStage Developer

Responsibilities:

  • Interacted with End user community to understand the business requirements and in identifying data sources.
  • Analyzed the existing informational sources and methods to identify problem areas and make recommendations for improvement. This required a detailed understanding of the data sources and researching possible solutions.
  • Implemented dimensional model (logical and physical) in the existing architecture using Erwin.
  • Studied the PL/SQL code developed to relate the source and target mappings.
  • Helped in preparing the mapping document for source to target.
  • Worked with DataStage Manager for importing metadata from the repository, creating new job categories, and creating new data elements.
  • Designed and developed ETL processes using DataStage Designer to load data from Oracle, MS SQL Server, DB2, flat files (fixed width), and XML files to the staging database, and from staging to the target data warehouse database.
  • Used DataStage stages such as Hashed File, Sequential File, Transformer, Aggregator, Sort, Data Set, Join, Lookup, Change Capture, Funnel, Peek, and Row Generator in the ETL coding.
  • Developed job sequencer with proper job dependencies, job control stages, triggers.
  • Extensively used the DataStage Director for monitoring job logs to resolve issues.
  • Involved in performance tuning and optimization of DataStage mappings using features like Pipeline and Partition Parallelism and data/index cache to manage very large volume of data.
  • Documented ETL test plans, test cases, test scripts, and validations based on SmartBear design specifications for unit, system, and functional testing; prepared test data for testing, error handling, and analysis.
  • Used the Control-M job scheduler to automate the daily and monthly runs of the DW cycle in both production and UAT environments.
  • Verified the Cognos report by extracting data from the staging database using PL/SQL queries (see the validation sketch at the end of this list).
  • Developed BO reports using the Information Design Tool.
  • Participated in daily huddle (Agile) and weekly status meetings.
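
A small sketch of the kind of validation query run against the staging database from the shell; the connect string, table, and column names are examples only.

    #!/bin/ksh
    # Pull staging totals to cross-check against the Cognos report.
    sqlplus -s etl_user/"$ORA_PASS"@STGDB <<EOF
    SET PAGESIZE 100 LINESIZE 200
    -- Row counts and billed amounts per load date, compared with report totals.
    SELECT load_date, COUNT(*) AS row_cnt, SUM(bill_amt) AS total_billed
    FROM   stg_billing
    GROUP  BY load_date
    ORDER  BY load_date;
    EXIT;
    EOF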

Environment: IBM DataStage 8.x, Oracle 10g, SQL Server 2008, DB2 UDB, flat files, sequential files, RTC (Agile), Control-M, TOAD 9.6, SQL*Plus, PuTTY, AIX UNIX, Business Objects IDT 4.x

Confidential, Memphis, TN

Sr. Ab Initio Developer

Responsibilities:

  • Developed source data profiling and analysis; reviewing data content and metadata facilitated data mapping and validated assumptions made in the business requirements.
  • Created the mini specs for different applications.
  • Involved in reviewing data analysis and best practices.
  • Developed various Ab Initio graphs for validation using the data profiler, comparing the current data with the previous month's data.
  • Used different Ab Initio components such as Partition by Key and Sort, Dedup, Rollup, Scan, Reformat, Join, and Fuse in various graphs.
  • Also used the Run Program and Run SQL components to run UNIX and SQL commands in Ab Initio.
  • Wrote UNIX control scripts, specific to the application, to pass environment variables.
  • Responsible for extracting daily text files from the FTP server and historical data from DB2 tables, cleansing the data, applying transformation rules, and loading it to the staging area.
  • Used Ab Initio as an ETL tool to pull data from source systems and to cleanse, transform, and load data into databases.
  • Involved in design, coding, and documentation best practices.
  • Involved in writing procedures, functions, and packages to load data into the database.
  • Wrote the .dbc files for the development, testing, and production environments.
  • Performed unit and system testing using sample and generated data, manipulating dates and verifying the functionality, data quality, and performance of the graphs.
  • Performed transformations of source data with transform components such as Join, Match Sorted, Dedup Sorted, Denormalize, Reformat, and Filter by Expression.
  • Made wide use of lookup files when getting data from multiple sources where the data size was limited.
  • Developed UNIX Korn shell scripts to automate job runs and support the redaction infrastructure, and SQL and PL/SQL procedures to load data into the database.
  • Involved in project promotion from development to UAT and from UAT to production.
  • Involved in production implementation best practices.
  • Used the BI tool SSRS for reporting purposes.
  • Used different EME air commands in project promotion, such as air tag create, air save, air load, and air project export (see the sketch at the end of this list).
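
An illustrative promotion sequence with EME air commands; the tag and object paths are examples, and exact air syntax varies by Co>Operating System version, so treat this as a sketch rather than exact commands.

    #!/bin/ksh
    # Tag a graph for release and save it to a file for the target EME.
    TAG=REL_2012_06                              # example release tag
    GRAPH=/Projects/claims/mp/load_claims.mp     # example object path

    air tag create $TAG $GRAPH

    # Save the tagged object so it can be loaded into the target EME.
    air object save /tmp/${TAG}.save $GRAPH

    # On the target EME host:
    #   air object load /tmp/REL_2012_06.save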

Environment: Ab Initio GDE 1.14, Co>Operating System 2.14, SQL Server 2008/2005, UNIX, Oracle 10.x, TOAD, UDB DB2, Windows XP, Teradata, AutoSys

Confidential, Harrison, NY

Ab Initio Developer

Responsibilities:

  • Involved as a designer and developer for the commercial business group data warehouse (CBGDWH).
  • Developed source data profiling and analysis; reviewing data content and metadata facilitated data mapping and validated assumptions made in the business requirements.
  • Created the minispecs for different applications.
  • Automated the development of Ab Initio graphs and functions utilizing the metadata from the EME to meet SB data redaction requirements.
  • Developed various Ab Initio graphs for validation using the data profiler, comparing the current data with the previous month's data and applying the AMS standards.
  • Involved in AMS installation in dev, testing AMS, and promoting it from dev to SIT, SIT to UAT, and UAT to production.
  • Used different Ab Initio components such as Partition by Key and Sort, Dedup, Rollup, Scan, Reformat, Join, and Fuse in various graphs.
  • Also used the Run Program and Run SQL components to run UNIX and SQL commands in Ab Initio.
  • Used the USPS address mapping system for correcting customer addresses.
  • Involved in writing procedures, functions, and packages to load data into the database.
  • Performed transformations of source data with transform components such as Join, Match Sorted, Dedup Sorted, Reformat, and Filter by Expression.
  • Made wide use of lookup files when getting data from multiple sources where the data size was limited.
  • Modified the Ab Initio EME to house the required redaction metadata.
  • Used different EME air commands in project promotion, such as air tag create, air save, air load, and air project export.

Environment: Ab Initio GDE 1.13, Co>Operating System 2.13, UNIX, Oracle 9i, SQL Navigator 5.0, SQL Server 2000, Cygwin, AMS software, Maestro, UDB DB2, Teradata, Windows 2000, Crystal Reports 9.0

Confidential, Plano, TX

Informatica Developer

Responsibilities:

  • This project consisted of Confidential products information, sales information, and billing information.
  • All accounts within a given billing product subscription must be billed to the same customer.
  • Worked with business users to prepare functional specification and technical design documents for the ETL process across the complete data warehouse cycle for Confidential wireless customer support and sales.
  • Extensively used ETL to load data from flat files, XML, and Oracle into Oracle 8i.
  • Involved in designing the data model for the data warehouse.
  • Involved in requirements gathering and business analysis.
  • Developed data mappings between source systems and warehouse components using Mapping Designer.
  • Worked extensively on different types of transformations such as Source Qualifier, Expression, Filter, Aggregator, Rank, Update Strategy, Lookup, Stored Procedure, Sequence Generator, Joiner, and XML.
  • Set up folders, groups, users, and permissions, and performed repository administration using Repository Manager.
  • Involved in performance tuning of the Informatica mappings, the stored procedures, and the SQL queries inside the Source Qualifier.
  • Created, launched, and scheduled sessions.
  • Involved in performance tuning of the database and Informatica; improved performance by identifying and rectifying performance bottlenecks.
  • Used Server Manager to schedule sessions and batches.
  • Involved in creating the Business Objects universe and appropriate reports.
  • Wrote PL/SQL packages and stored procedures to implement business rules and validations.

Environment: Informatica 7.0, Oracle 9i, SQL Navigator 5.0, DB2, Teradata, Windows 2000, UNIX

Confidential, Columbus, OH

Ab Initio Developer

Responsibilities:

  • The client is the largest publicly held personal auto insurer in the U.S. The financial pilot module mainly converted existing PL/SQL procedures to Ab Initio graphs; the applications pull data from the legacy data source (EDW) to the landing zone.
  • Extracted data from the Oracle database and used the extracted data to populate the data warehouse tables.
  • Used Ab Initio as an ETL tool to pull data from source systems and to cleanse, transform, and load the data into databases.
  • Involved in Ab Initio graph design and performance tuning of the graph load process.
  • Converted date formats from yymmdd to the Oracle standard date format.
  • Used the Ab Initio repository to store the developed graphs for future dependency analysis when needed.
  • Uncompressed the source data using the Gunzip component and translated the data from EBCDIC to ASCII.
  • Responsible for creating test cases to verify that data originating from the source made it into the target properly and in the right format.
  • Created Ab Initio multi file systems (MFS) to take advantage of partition parallelism.
  • Implemented a 4-way multifile system composed of individual files on different nodes that were partitioned and stored in distributed directories.
  • Designed and developed Ab Initio graphs using different components such as Reformat, Rollup, Scan, and Join, and performed functional testing of the graphs.
  • Automated both the monthly and weekly refreshes (data loads) using the cron utility (see the crontab sketch at the end of this list).
  • Created crontab jobs to run the different applications, each at its scheduled time.
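
A minimal crontab sketch for the weekly and monthly refreshes mentioned above; the script paths and run times are hypothetical.

    # minute hour day-of-month month day-of-week  command
    # Weekly refresh: Sundays at 02:00.
    0 2 * * 0 /apps/etl/bin/weekly_refresh.ksh >> /apps/etl/log/weekly_refresh.log 2>&1
    # Monthly refresh: first of the month at 03:30.
    30 3 1 * * /apps/etl/bin/monthly_refresh.ksh >> /apps/etl/log/monthly_refresh.log 2>&1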

Environment: Ab Initio GDE 1.12, Co>Operating System 2.12, UNIX, PL/SQL, Oracle 9i, SQL Server 7.0, Windows NT

Confidential, San Francisco, CA

Ab Initio Developer

Responsibilities:

  • Created Ab Initio graphs to load large volumes of data, ranging from several GB to TB.
  • Used the Ab Initio Web Interface to navigate the EME to view graphs, files, and datasets and to examine the dependencies among objects.
  • Extracted data from Oracle and used it to populate the Teradata data warehouse tables associated with the data mart.
  • Created Korn shell scripts and cron jobs to refresh the load on a weekly basis.
  • Developed complex Ab Initio XFRs to derive new fields and solve various business requirements.
  • Created test scenarios that were used to validate the Ab Initio graphs.
  • Responsible for cleansing the data from source systems using Ab Initio components such as reformat and filter by expression.
  • Used subgraphs to increase the clarity of graphs and to impose reusable business restrictions, and tested various graphs for their functionality.
  • Developed several partition-based Ab Initio graphs for the high-volume data warehouse.

Environment: Ab Initio GDE 1.12, Co>Operating System 2.12, UNIX, PL/SQL, Oracle 9i, SQL Server 7.0, Windows NT
