Sr. DataStage Developer Resume
Glen Allen, VA
SUMMARY
- 14+ years of IT experience in software development, design and implementation, with a major focus on data warehousing and database applications.
- 12+ years of strong data warehousing experience using the ETL tools DataStage, Informatica and Ab Initio.
- Proficient with relational database models, stored procedures and views.
- 8+ years of experience in advanced SQL and PL/SQL programming to pull data from various databases.
- 6+ years of experience in Korn shell scripting to run jobs sequentially and to send data to other servers with FTP commands (see the shell sketch following this summary).
- 6 years of experience in development and testing using IBM WebSphere DataStage 9.x/8.x.
- 3+ years of experience in development and testing using Ab Initio.
- 3+ years of experience with Informatica Data Quality (IDQ).
- 2+ years of experience in Business Objects, using the Information Design Tool to publish BI reports.
- Extensive experience in Korn shell scripting.
- Data processing experience in designing and implementing Data Mart applications, mainly transformation processes, using the ETL tool DataStage 9.x/8.x; designed and developed jobs using DataStage Designer, Manager, Director and Debugger.
- Efficient in all phases of the development lifecycle, including data cleansing, data conversion, performance tuning and system testing.
- Proven experience in data modeling, data management, data warehousing, data transformation, metadata and reference data management, and Erwin tools.
- Skilled in designing, developing, implementing and maintaining applications in risk management, fraud control and online bill payment; extensive experience with Big Data technologies and the Hadoop stack, including loading data from Oracle to Hadoop.
- Extensive experience with UNIX, Linux, SQL and application production support.
- Created comprehensive application support documentation (runbooks) for use by L2 and L3 support.
- Tracked production issues using ServiceNow incidents and tasks.
- Hands-on data experience with Netezza, SAP HANA, Oracle, DB2, SQL Server and PostgreSQL.
- Skilled in Informatica application support, including Informatica Data Quality and Data Transformation.
- Skilled in Ab Initio application support, including Ab Initio Express>IT (BRE).
- Working knowledge of Java/J2EE and SAS.
- Worked with various Hadoop distributions (Cloudera, Hortonworks, Amazon AWS).
- Extensive experience with JIRA and RTC for tracking task status online under Agile methodology.
- Experience preparing BI reports with Informatica, Crystal Reports and Business Objects based on client reporting requirements.
- Experienced in using Korn shell scripting to maximize Ab Initio parallelism; developed numerous Ab Initio graphs using data parallelism and Multi File System (MFS) techniques.
- Experience in designing, developing and testing applications using Ab Initio software.
- Worked with continuous components and XML components in Ab Initio.
- Experience working with heterogeneous source systems such as Netezza, SAP HANA, Oracle, SQL Server, Teradata, DB2 UDB and flat files.
- Good knowledge of relational database utilities such as TOAD, SQL Assistant and SQL*Loader.
- Strong knowledge of batch job automation tools, including Autosys and Control-M.
- Sound knowledge of data warehouse concepts, star schema and snowflake schema designs, and dimensional modeling.
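A minimal sketch of the kind of Korn shell job-runner and FTP hand-off described in the summary above; the job script names, host, credentials and paths are hypothetical placeholders, not the actual production setup.

```ksh
#!/bin/ksh
# Run ETL job scripts one after another, stopping on the first failure.
LOG=/tmp/nightly_run_$(date +%Y%m%d).log

for job in extract_accounts.ksh transform_accounts.ksh load_accounts.ksh
do
    print "$(date) starting $job" >> "$LOG"
    ./"$job" >> "$LOG" 2>&1
    if [ $? -ne 0 ]; then
        print "$(date) $job failed; aborting run" >> "$LOG"
        exit 1
    fi
done

# Hand the finished extract to the downstream server over FTP.
ftp -n downstream.example.com <<EOF >> "$LOG" 2>&1
user ftpuser ftppass
binary
put /data/out/accounts.dat /incoming/accounts.dat
bye
EOF
```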
TECHNICAL SKILLS
Operating Systems: DOS, UNIX, Linux, Windows 95/98/2000/NT 4.0/XP
Database Management Systems: Oracle 8i/9i, SQL Server 2000/2005, MS Access, DB2, Teradata
Programming Languages: C, C++, JavaScript, COBOL, SQL, PL/SQL, Korn shell script
GUI and Web Technologies: Visual Basic 5.0/6.0, VB.NET, HTML, XML, SAP BI BO 4.x
ETL Tools: IBM WebSphere DataStage 9.x/8.x, Informatica 8.5/9.x, IDQ 9.x, Ab Initio GDE 1.14/1.15.6, Co>Operating System 2.13/2.14
PROFESSIONAL EXPERIENCE
Confidential, Glen Allen, VA
Sr. DataStage Developer
Responsibilities:
- Involved in preparing the technical specifications and the source-to-target mapping document based on the functional specifications document.
- Worked with the source system team (product processors) to understand and standardize the data layout.
- Analyzed the existing informational sources and methods to identify problem areas and make recommendations for improvement.
- Helped prepare the source-to-target mapping document.
- Worked with DataStage Manager to import metadata from the repository, create new job categories and create new data elements.
- Used DataStage stages, namely Hashed File, Sequential File, Transformer, Aggregator, Sort, Data Set, Join, Lookup, Change Capture, Funnel, Peek, Copy, Merge, and the DB2, Oracle and Netezza Connector stages, to accomplish the ETL coding.
- Imported data from a Teradata database to HDFS using specified compression codecs with Sqoop (see the Sqoop sketch below).
- Involved in enhancement and maintenance activities of the data warehouse, including tuning and modifying stored procedures for code enhancements.
- Successfully loaded files from Oracle to HDFS, and from HDFS to Netezza.
- Used the debugger to identify bugs in existing mappings by analyzing data flow and evaluating transformations.
- Used pre- and post-session assignment variables to pass variable values from one session to another.
- Reviewed and analyzed functional requirements and mapping documents; performed problem solving and troubleshooting.
- Performed unit testing at various levels of the ETL process and actively participated in team code reviews.
- Identified problems in existing production data and developed one-time scripts to correct them.
- Fixed invalid mappings and troubleshot technical problems in the database.
- Prepared unit test cases and the implementation plan, and migrated code to the next layers.
- Used Business Objects for reporting purposes.
- Used the Control-M job scheduler to automate the daily and monthly runs of the DW cycle in both production and UAT environments.
- Wrote shell scripts (wrapper scripts) to handle data before executing sessions and to meet the prerequisites for running command sessions.
- Provided 24x7 production support for ETL jobs on daily, weekly and monthly schedules.
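A hedged sketch of the Teradata-to-HDFS import described above; the JDBC URL, credentials, table and target directory are hypothetical placeholders.

```ksh
#!/bin/ksh
# Import a Teradata table into HDFS, compressing the output with Snappy.
sqoop import \
    --connect jdbc:teradata://td-host/DATABASE=claims_db \
    --driver com.teradata.jdbc.TeraDriver \
    --username etl_user --password-file /user/etl/.td_pass \
    --table CLAIM_DETAIL \
    --target-dir /data/landing/claim_detail \
    --num-mappers 4 \
    --compress \
    --compression-codec org.apache.hadoop.io.compress.SnappyCodec \
    --fields-terminated-by '|'
```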
Environment: DataStage 9.1, Oracle 12c, Hadoop, HDFS, Netezza, DB2, SAP BI BusinessObjects 4.x, Aginity, TOAD, SQuirreL SQL, Control-M, Rational Team Concert (RTC), PuTTY, Linux, WinSCP, Windows XP
Confidential, Tampa, FL
Sr. DataStage Developer
Responsibilities:
- Involved in source-to-target data field mapping discussions.
- Collaborated with business users, business analysts and project managers to translate the business requirements into technical specifications.
- Involved in analysis of the physical data model for ETL mapping and the process flow diagrams.
- Responsible for creating the High Level Design (HLD) and Low Level Design (LLD) application interface documents.
- Responsible for designing the schema, with a special focus on user interface design.
- Responsible for conceptualizing and establishing the need for data warehousing solutions as applicable to billing systems.
- Evaluated the consistency and integrity of the model and repository.
- Used DataStage Parallel Extender for parallel processing of data extraction and transformation.
- Used Integrity and Parallel Extender for data cleansing and performance improvement.
- Extensively used almost all DataStage transforms for various types of date conversions.
- Created partitioned primary indexes and secondary indexes for query performance in DB2.
- Responsible for tuning the DataStage repository and jobs for optimum performance.
- Parsed, matched and removed duplicate records using Integrity's built-in wizards.
- Scheduled and monitored automated weekly jobs.
- Performed unit testing of individual modules and their integration testing.
- Debugged and resolved errors and problems encountered in the production environment.
- Performed daily production support of the production data warehouse.
- Wrote shell scripts to run ETL workflows in the UNIX environment (see the sketch below).
- Optimized performance at the source, target, mapping and session levels.
- Used Business Objects for reporting purposes.
- Participated in weekly status meetings, conducted internal and external reviews and formal walkthroughs among various teams, and documented the proceedings.
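A minimal sketch of one way the job-launching shell scripts described above could look, wrapping the DataStage dsjob CLI; the project and job names are hypothetical.

```ksh
#!/bin/ksh
# Launch a DataStage job and translate its status into a shell exit code.
PROJECT=DWPROJ          # hypothetical project
JOB=LoadPolicyFact      # hypothetical job

dsjob -run -mode NORMAL -jobstatus "$PROJECT" "$JOB"
rc=$?
# With -jobstatus, dsjob exits with the job's status:
# 1 = finished OK, 2 = finished with warnings; treat anything else as failure.
if [ "$rc" -eq 1 ] || [ "$rc" -eq 2 ]; then
    print "$JOB completed with status $rc"
else
    print "$JOB failed with status $rc" >&2
    exit 1
fi
```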
Environment: IBM InfoSphere DataStage 9.x, Oracle, Oracle SQL Developer, SAP BI BusinessObjects 4.x, Teradata, Control-M, Linux, WinSCP, PuTTY, Windows XP
Confidential, Frederick, MD
Sr. DataStage Developer/Analyst
Responsibilities:
- Involved in source-to-target data field mapping discussions.
- Worked with the source system team (product processors) to understand and standardize the data layout.
- Worked with other developers and business analysts to create a best-in-class ETL solution.
- Served as the primary on-site ETL developer during the analysis, planning, design, development and implementation stages of projects using IBM InfoSphere DataStage 9.x.
- Prepared data mapping documents and designed the ETL jobs based on the DMD, with the required tables in the development environment.
- Used DataStage as an ETL tool to extract data from source systems and load it into the Oracle database and HDFS.
- Designed and developed DataStage jobs to extract data from heterogeneous sources, applied transformation logic to the extracted data and loaded it into data warehouse databases.
- Created DataStage jobs using stages such as Transformer, Aggregator, Sort, Join, Merge, Lookup, Data Set, Funnel, Remove Duplicates, Copy, Modify, Filter, Change Capture, Change Apply, Sample, Surrogate Key, Column Generator and Row Generator.
- Extensively worked with the Join, Lookup (normal and sparse) and Merge stages.
- Extensively worked with the Sequential File, Data Set, File Set and Lookup File Set stages.
- Extensively used parallel stages such as Row Generator, Column Generator, Head and Peek for development and debugging purposes.
- Used the DataStage Director and its run-time engine to schedule runs of the solution, test and debug its components, and monitor the resulting executables on an ad hoc or scheduled basis.
- Developed complex stored procedures using input/output parameters, cursors, views and triggers, and complex queries using temp tables and joins.
- Converted complex job designs into separate job segments executed through the job sequencer for better performance and easier maintenance.
- Created job sequences.
- Maintained the data warehouse by loading dimensions and facts as part of the project; also worked on various enhancements to FACT tables.
- Created a shell script to run DataStage jobs from UNIX, then scheduled this script through a scheduling tool (see the scheduler sketch below).
- Coordinated with team members and administered all onsite and offshore work packages.
- Analyzed performance and monitored work with capacity planning.
- Performed performance tuning of jobs by interpreting the performance statistics of the jobs developed.
- Documented ETL test plans, test cases, test scripts and validations based on design specifications for unit, system and functional testing; prepared test data for testing, error handling and analysis.
- Participated in weekly status meetings.
- Developed a test plan that covered the scope of the release, entrance and exit criteria and the overall test strategy; created detailed test cases and test sets and executed them manually.
- Involved in enhancement and maintenance activities of the data warehouse, including tuning and modifying stored procedures for code enhancements.
- Used the debugger to identify bugs in existing jobs by analyzing data flow and evaluating transformations.
- Performed unit testing at various levels of the ETL process and actively participated in team code reviews.
- Identified problems in existing production data and developed one-time scripts to correct them.
- Fixed invalid mappings and troubleshot technical problems in the database.
- Used Business Objects (IDT) for reporting purposes.
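A hedged sketch of the run-and-schedule pattern described above: a generic runner that a scheduling tool can invoke (cron is shown purely as an example); the paths and names are hypothetical.

```ksh
#!/bin/ksh
# run_ds_job.ksh <project> <job> -- generic runner meant to be called by a scheduler.
# Example crontab entry (hypothetical schedule):
#   30 2 * * * /apps/etl/bin/run_ds_job.ksh DWPROJ LoadCustomerDim
PROJECT=$1
JOB=$2
LOGDIR=/apps/etl/logs   # hypothetical log location

dsjob -run -mode NORMAL -jobstatus "$PROJECT" "$JOB" \
    > "$LOGDIR/${JOB}_$(date +%Y%m%d%H%M).log" 2>&1
rc=$?
# dsjob status 1 (OK) and 2 (warnings) count as success here.
if [ "$rc" -ne 1 ] && [ "$rc" -ne 2 ]; then
    print "$JOB failed with status $rc" >&2
    exit 1
fi
```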
Environment: IBM InfoSphere DataStage 9.x, Oracle, DB2, Netezza, Hadoop, HDFS, SAP BI BusinessObjects 4.x, JIRA, legacy systems, Teradata, TOAD, PuTTY, WinSCP, Windows XP
Confidential, Franklin Lake, NJ
Sr. Informatica Developer/Analyst
Responsibilities:
- Involved in writing the detailed design document and preparing test cases.
- Interacted closely with the business analyst team on feed mapping documents.
- Imported data from an Oracle database to HDFS using specified compression codecs with Sqoop.
- Involved in developing the mappings, mapplets, transformations, sessions, tasks and workflows for the Confidential & Claims Compare applications.
- Involved in all stages of the SDLC during the project; analyzed, designed and tested the new system for performance, efficiency and maintainability using Informatica performance techniques.
- Extracted West claim files, translated them into WHSEFUL format and compared them with the East claims WHSEFUL-format files.
- Developed source-to-staging and staging-to-EDW ETL mappings, integrating DB2 and Oracle database data into the Netezza enterprise data warehouse.
- Designed and developed the mappings using Informatica PowerCenter, with Sorter, Normalizer, Router, Update Strategy and Sequence Generator components.
- Involved in data integration, combining data from several data sources.
- Involved in data migration from one repository to another based on the migration requirement process, transferring data from one system to another while changing the storage, database or application, using Informatica.
- Liaised between the development team and business users to identify defects and to fix and deploy enhancements and bug fixes in the production environment.
- Involved in writing SQL join queries for data comparison.
- Involved in unit testing and system testing of the Confidential application.
- Involved in comparing claims between the West and East WHSEFUL files.
- Wrote several application-specific UNIX control scripts to pass environment variables and send files to a shared drive.
- Inserted data into the Teradata data warehouse using the FastLoad, MultiLoad and BTEQ script utilities (see the BTEQ sketch below).
- Created comprehensive application support documentation (runbooks) for use by L2 and L3 support.
- Coordinated production code and infrastructure changes using Remedy Change Management.
- Used Business Objects for reporting purposes.
- Tracked production issues using Remedy incidents and tasks.
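A hedged sketch of a BTEQ import of a delimited file into Teradata, one of the load utilities named above; the TDPID, credentials, file and table are placeholders.

```ksh
#!/bin/ksh
# Load a pipe-delimited claims file into a Teradata staging table with BTEQ.
bteq <<'EOF'
.LOGON tdprod/etl_user,etl_pass;
.IMPORT VARTEXT '|' FILE = /data/in/claims.txt;
.QUIET ON;
.REPEAT *
USING (claim_id VARCHAR(18), claim_amt VARCHAR(18))
INSERT INTO stg.claims (claim_id, claim_amt)
VALUES (:claim_id, :claim_amt);
.LOGOFF;
.QUIT;
EOF
```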
Environment: Informatica 9.0, Oracle, PostgreSQL, UDB DB2, Teradata, SAP BI BusinessObjects 4.x, TOAD, JIRA, UNIX, WinSCP, PuTTY, Windows XP
Confidential, Herndon, VA
Sr. Programmer Analyst
Responsibilities:
- Interacted closely with the business analyst team on feed mapping documents.
- Involved in all phases of the SDLC, from requirements gathering, design, development and testing to production, user training and production support.
- Created new mapping designs using various tools in Informatica Designer, including Source Analyzer, Warehouse Designer, Mapplet Designer and Mapping Designer.
- Developed the mappings using the needed transformations in Informatica according to technical specifications.
- Created complex mappings that implemented business logic to load data into the staging area.
- Used Informatica reusability features at various levels of development.
- Developed mappings and sessions using Informatica PowerCenter 9.0 for data loading.
- Performed data manipulations using various Informatica transformations, such as Filter, Expression, Lookup (connected and unconnected), Aggregator, Update Strategy, Normalizer, Joiner, Router, Sorter and Union.
- Developed workflows using the Task Developer, Worklet Designer and Workflow Designer in Workflow Manager, and monitored the results in Workflow Monitor.
- Built reports according to user requirements.
- Extracted data from Oracle and SQL Server, then used Teradata for data warehousing.
- Implemented slowly changing dimension methodology to preserve the full history of accounts.
- Wrote shell scripts to run workflows in the UNIX environment (see the pmcmd sketch below).
- Optimized performance at the source, target, mapping and session levels.
- Used Business Objects for reporting purposes.
- Participated in weekly status meetings, conducted internal and external reviews and formal walkthroughs among various teams, and documented the proceedings.
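A minimal sketch of launching an Informatica workflow from a shell script with pmcmd, per the shell-script bullet above; the domain, service, folder and workflow names are hypothetical.

```ksh
#!/bin/ksh
# Start a PowerCenter workflow and wait for it to finish.
pmcmd startworkflow \
    -sv INT_SVC_DEV -d Domain_Dev \
    -u "$INFA_USER" -p "$INFA_PASS" \
    -f SALES_FOLDER -wait wf_load_sales
if [ $? -ne 0 ]; then
    print "wf_load_sales failed" >&2
    exit 1
fi
```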
Environment: Informatica 9.0, Oracle, UDB DB2, SAP BI BusinessObjects 4.x, PostgreSQL, Netezza, TOAD, Linux, PuTTY, SQL/PL-SQL, Windows 2000/XP, Remedy
Confidential, Warren, NJ
Sr. DataStage Developer
Responsibilities:
- Interacted closely with the outbound feeds team on feed mapping documents.
- Actively participated in decision-making and QA meetings; regularly interacted with the business analysts and development team to gain a better understanding of the business process, requirements and design.
- Involved in all stages of the SDLC during the project; analyzed, designed and tested the new system for performance, efficiency and maintainability using IBM WebSphere DataStage.
- Used DataStage as an ETL tool to extract data from source systems and load it into the Oracle database.
- Designed and developed DataStage jobs to extract data from heterogeneous sources, applied transformation logic to the extracted data and loaded it into data warehouse databases.
- Created DataStage jobs using stages such as Transformer, Aggregator, Sort, Join, Merge, Lookup, Data Set, Funnel, Remove Duplicates, Copy, Modify, Filter, Change Capture, Change Apply, Sample, Surrogate Key, Column Generator and Row Generator.
- Extensively worked with the Join, Lookup (normal and sparse) and Merge stages.
- Extensively worked with the Sequential File, Data Set, File Set and Lookup File Set stages.
- Extensively used parallel stages such as Row Generator, Column Generator, Head and Peek for development and debugging purposes.
- Used the DataStage Director and its run-time engine to schedule runs of the solution, test and debug its components, and monitor the resulting executables on an ad hoc or scheduled basis.
- Developed complex stored procedures using input/output parameters, cursors, views and triggers, and complex queries using temp tables and joins.
- Converted complex job designs into separate job segments executed through the job sequencer for better performance and easier maintenance.
- Developed complex jobs using stages such as Lookup, Join, Transformer, Data Set, Row Generator, Column Generator, Sequential File, Aggregator and Modify.
- Created queries using joins and CASE statements to validate data in different databases.
- Created queries to compare data between two databases and confirm that the data matched (see the comparison sketch below).
- Used the DataStage Director to debug components and monitor the resulting executables on an ad hoc or scheduled basis.
- Monitored DataStage jobs daily by running UNIX shell scripts, and force-started jobs whenever they failed.
- Created and modified batch scripts to FTP files from different servers to the DataStage server.
- Extensively used the slowly changing dimension Type 2 approach to maintain history in the database.
- Involved in unit testing and the resolution of various bottlenecks encountered.
- Implemented various performance tuning techniques.
- Involved in writing SQL join queries for implementing business rules in the Source Qualifier, for performance improvement and for data analysis.
- Involved in production implementation best practices.
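A hedged sketch of the cross-database comparison queries mentioned above, using SQL*Plus and a MINUS over a database link; the connect string, link, columns and table are placeholders.

```ksh
#!/bin/ksh
# List rows present in the local schema but missing from the remote database.
sqlplus -s etl_user/etl_pass@DWPROD <<'EOF'
SET PAGESIZE 0 FEEDBACK OFF
SELECT cust_id, cust_nm FROM src_schema.customer
MINUS
SELECT cust_id, cust_nm FROM customer@dw_link;
EXIT;
EOF
```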
Environment: IBM WebSphere DataStage 9.x, AIX, Oracle, Netezza, Teradata, TOAD, SQuirreL SQL, Rational Team Concert (RTC), PuTTY, SQL/PL-SQL, Windows XP
Confidential, Warren, NJ
Sr. Informatica Developer/Analyst
Responsibilities:
- Fund Service Reporting (FSR) contains three modules, i.e. Extracts, MultiFunds and Loads.
- Developed high-level and low-level design documents for processing each fund extract, documenting the various implementations made in each branch of the FSR application.
- Developed the ETL data flow process for the FSR project using Microsoft Visual Studio.
- Involved in source-to-target data field mapping discussions and production support.
- Developed ETL programs using Informatica to implement the business requirements.
- Communicated with business customers to discuss issues and requirements.
- Created shell scripts to fine-tune the ETL flow of the Informatica workflows.
- Used Informatica file watch events to poll the FTP sites for the external mainframe files.
- Provided production support to resolve ongoing issues and troubleshoot problems.
- Performed performance tuning at the functional and map levels; used relational SQL wherever possible to minimize data transfer over the network.
- Effectively used Informatica parameter files to define mapping variables, workflow variables, FTP connections and relational connections (see the parameter-file sketch below).
- Involved in enhancement and maintenance activities of the data warehouse, including tuning and modifying stored procedures for code enhancements.
- Effectively worked in an Informatica version-controlled environment and used deployment groups to migrate objects.
- Used the debugger to identify bugs in existing mappings by analyzing data flow and evaluating transformations.
- Effectively worked in an onsite/offshore work model.
- Used pre- and post-session assignment variables to pass variable values from one session to another.
- Designed workflows with many sessions using decision, assignment, event-wait and event-raise tasks; used the Informatica scheduler to schedule jobs.
- Reviewed and analyzed functional requirements and mapping documents; performed problem solving and troubleshooting.
- Performed unit testing at various levels of the ETL process and actively participated in team code reviews.
- Identified problems in existing production data and developed one-time scripts to correct them.
- Fixed invalid mappings and troubleshot technical problems in the database.
- Involved in writing SQL queries to migrate extracts from DEV to UAT.
- Involved in writing SQL join queries to determine whether funds had loaded into the database.
- Wrote several application-specific UNIX control scripts to pass files to another server.
- Responsible for extracting data (daily, weekly, monthly) from the database and maintaining historical data in the database for BI reports.
- Inserted data into the Teradata data warehouse using the FastLoad, MultiLoad and BTEQ script utilities.
- Used phasing and checkpoints in the graphs to avoid deadlock, control the subscribers' messages and commit changes to the database at every checkpoint.
- Used Business Objects for reporting purposes.
- Involved in production migration and scheduled production support.
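A hedged sketch of the parameter-file usage described above: a wrapper that generates a run-specific parameter file and passes it to pmcmd; the folder, workflow, session and variable names are hypothetical.

```ksh
#!/bin/ksh
# Build a run-specific parameter file, then launch the workflow with it.
RUNDATE=$(date +%Y%m%d)
PARAMFILE=/apps/infa/param/wf_fsr_extract_${RUNDATE}.par

cat > "$PARAMFILE" <<EOF
[FSR.WF:wf_fsr_extract.ST:s_m_fund_extract]
\$\$RUN_DATE=$RUNDATE
\$InputFile_Funds=/data/in/funds_${RUNDATE}.dat
EOF

pmcmd startworkflow \
    -sv INT_SVC -d Domain_Prod \
    -u "$INFA_USER" -p "$INFA_PASS" \
    -f FSR -paramfile "$PARAMFILE" -wait wf_fsr_extract
```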
Environment: Informatica 8.5, Oracle 10g, UDB DB2, QTODBC, SQLDBX, PuTTY, SQL/PL-SQL, UNIX, shell scripts, WinSCP, XML
Confidential, Atlanta, GA
DataStage Developer
Responsibilities:
- Interacted with the end-user community to understand the business requirements and identify data sources.
- Analyzed the existing informational sources and methods to identify problem areas and make recommendations for improvement; this required a detailed understanding of the data sources and research into possible solutions.
- Implemented the dimensional model (logical and physical) in the existing architecture using Erwin.
- Studied the PL/SQL code developed, in order to relate the source and target mappings.
- Helped prepare the source-to-target mapping document.
- Worked with DataStage Manager to import metadata from the repository, create new job categories and create new data elements.
- Designed and developed ETL processes using DataStage Designer to load data from Oracle, MS SQL Server, DB2, flat files (fixed-width) and XML files to the staging database, and from staging to the target data warehouse database.
- Used DataStage stages, namely Hashed File, Sequential File, Transformer, Aggregator, Sort, Data Set, Join, Lookup, Change Capture, Funnel, Peek and Row Generator, to accomplish the ETL coding.
- Developed job sequencers with proper job dependencies, job control stages and triggers.
- Extensively used DataStage Director to monitor job logs and resolve issues.
- Involved in performance tuning and optimization of DataStage mappings, using features such as pipeline and partition parallelism and data/index caching to manage very large volumes of data.
- Documented ETL test plans, test cases, test scripts and validations based on SmartBear design specifications for unit, system and functional testing; prepared test data for testing, error handling and analysis.
- Used the Control-M job scheduler to automate the daily and monthly runs of the DW cycle in both production and UAT environments.
- Verified the Cognos reports by extracting data from the staging database using PL/SQL queries.
- Developed BO reports using the Information Design Tool.
- Participated in daily huddles (Agile) and weekly status meetings.
Environment: IBM DataStage 8.x, Oracle 10g, SQL Server 2008, DB2 UDB, flat files, sequential files, RTC (Agile), Control-M, TOAD 9.6, SQL*Plus, PuTTY, AIX UNIX, Business Objects IDT 4.x
Confidential, Cash Memphis, TN
Sr. Ab Initio Developer
Responsibilities:
- Developed source data profiling and analysis; reviewing data content and metadata facilitated data mapping and validated assumptions made in the business requirements.
- Created the mini-specs for different applications.
- Involved in reviewing the data analysis and best practices.
- Developed various Ab Initio graphs to validate data using the Data Profiler, comparing the current data with the previous month's data.
- Used Ab Initio components such as Partition by Key and Sort, Dedup, Rollup, Scan, Reformat, Join and Fuse in various graphs.
- Also used the Run Program and Run SQL components to run UNIX and SQL commands from Ab Initio.
- Wrote application-specific UNIX control scripts to pass environment variables.
- Responsible for extracting daily text files from the FTP server and historical data from DB2 tables, cleansing the data, applying transformation rules and loading to the staging area.
- Used Ab Initio as an ETL tool to pull data from source systems, then cleanse, transform and load the data into databases.
- Involved in design, coding and documentation best practices.
- Involved in writing procedures, functions and packages to upload data to the database.
- Wrote the .dbc files for the development, testing and production environments.
- Performed unit and system testing using sample data; generated and manipulated data to verify the functionality, data quality and performance of graphs.
- Performed transformations of source data with transform components such as Join, Match Sorted, Dedup Sorted, Denormalize, Reformat and Filter by Expression.
- Made extensive use of lookup files when pulling data from multiple sources where the data volume is limited.
- Developed UNIX Korn shell scripts to automate job runs and support the redaction infrastructure, and SQL and PL/SQL procedures to load data into the database (see the sketch below).
- Involved in project promotion from development to UAT and from UAT to production.
- Involved in production implementation best practices.
- Used the BI tool SSRS for reporting purposes.
- Used different EME air commands in project promotion, such as air tag create, air save, air load and air project export.
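A minimal sketch of the automation described in the Korn shell bullet above: running a deployed Ab Initio graph (deployed graphs are generated as ksh scripts) and then calling a PL/SQL load procedure; the environment file, paths, graph and procedure names are hypothetical.

```ksh
#!/bin/ksh
# Run a deployed Ab Initio graph, then kick off the PL/SQL load on success.
. /apps/abinitio/config/ab_env.ksh      # hypothetical environment setup

/apps/abinitio/run/validate_claims.ksh
if [ $? -ne 0 ]; then
    print "validate_claims graph failed" >&2
    exit 1
fi

sqlplus -s etl_user/etl_pass@DWPROD <<'EOF'
EXEC load_pkg.load_claims;
EXIT;
EOF
```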
Environment: Ab Initio GDE 1.14, Co>Operating System 2.14, SQL Server 2008/2005, UNIX, Oracle 10.x, TOAD, UDB DB2, Windows XP, Teradata, AutoSys