- 6+ years of IT experience in analysis, design, development, implementation, and testing of client/server, data warehousing and OLAP applications using ETL tools.
- Extensive experience using DataStage stages such as Lookup, Join, Merge, Funnel, Remove Duplicates, Aggregator, DB2 Enterprise, Oracle Enterprise, Sequential File, Dataset and File Set.
- Extensive experience with data warehousing and business intelligence applications, with participation in the entire project life cycle.
- Extensive knowledge of pipeline/partition concepts and techniques: Round Robin, Entire, Hash by Field, Modulus and Range.
- Experience with ETL design and development of data marts, enterprise-wide data warehouses and operational data stores.
- Proficiency in object-oriented Perl, including Moose.
- Used the Job Sequencer to specify the network/sequence of jobs to run.
- Extensive experience with parallel stages (Sort, Merge, Aggregator, Transformer, Funnel, Filter, Lookup, Surrogate Key, Remove Duplicates, Column/Row Generator and Peek) and mainframe stages (Aggregator, Join, Lookup, Fixed-Width Flat File).
- Experience with Perl for file handling, regular expressions for parsing sensitive information, and the DBI/DBD module for Sybase connections from Perl scripts.
- Developed Extraction, Transformation and Loading (ETL) solutions to meet clients' business needs and provided technical leadership to outsourcing team members.
- Worked with varied stages such as Lookup, Join, Merge, Sort, Copy, Filter, Transformer, Aggregator and Funnel.
- Proficient in Teradata database design (physical and logical), application support, performance tuning, optimization, user and security administration, data administration and environment setup.
- Monitored Teradata CPU/load utilization, space management and system health trends periodically for future enhancements and pipeline projects.
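As a concrete illustration of the Modulus technique listed above, the sketch below mimics it with plain awk rather than DataStage; file names and key values are invented for the example.

```shell
#!/bin/sh
# Illustration of DataStage-style Modulus partitioning, done with awk.
# Each record is routed to partition (key mod N). Sketch only.

N=3   # number of partitions (in a real job, the node count)

# Sample records in key|payload form (invented data)
cat > input.txt <<'EOF'
10|alpha
11|beta
12|gamma
13|delta
EOF

# Route each record to partition (key mod N), as Modulus would
awk -F'|' -v n="$N" '{ print > ("partition_" ($1 % n) ".txt") }' input.txt

# Show the resulting distribution across partitions
grep -c . partition_0.txt partition_1.txt partition_2.txt
```

Hash by Field would instead hash the key so that non-numeric keys also land deterministically; Round Robin ignores the key entirely and deals records out evenly.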
Operating System: Windows 2000/XP/Vista, Unix and Linux
Databases: Teradata 13.0, Oracle 11g, SQL Server, DB2, Mainframes
Scripting languages: Shell Script, PL/SQL, SQL, C, C++, COBOL, TSO
Tools: Queryman, Designer, Developer 6i, Erwin, SQL*Loader, Oracle, Toad, Query Analyzer, OEM.
Warehousing Tools: IBM Information Server, IBM Information Analyzer, Data Stage
Testing Tools: WinRunner 7.0, LoadRunner 7.0, TestDirector 7.0, Rational ClearQuest, Jira
Other Tools: MS Office, MS Word, MS Excel, MS Project.
Data Stage Developer
- Devised extraction, transformation and load schemes for the process.
- Developed DataStage designs and handled execution, testing and deployment on the client's server.
- Segmented data according to the categories provided by the client.
- Developed components and functions for data masking, encoding and decoding.
- Performance tuned and resolved scratch disk memory errors.
- Optimized the search function and created custom search patterns.
- Executed Product Development Life Cycle (PDLC) process for different projects.
- Developed project schedules and ETL process according to the project requirements.
- Transferred data from DataStage via NDM (Network Data Mover) to Hadoop.
- Built a reusable DataStage framework covering all target tables.
- Wrote applications testing guideline and methods.
- Wrote function code for different tasks against the database tables.
- Designed and developed Data stage PX ETL jobs (Parallel jobs and job sequencers).
- Developed parallel jobs using stages including Join, Transformer, Sort, Merge, Filter and Lookup.
- Extracted merchant data from XML documents using XML input stage.
- Used the Remove Duplicates stage in PX (EE) to de-duplicate the data.
- Created job sequences. Created job schedules to automate the ETL process.
- Mapped the source and target databases by studying the specifications and analyzing the required transforms.
- Used DataStage Director to validate, schedule and run production jobs.
- Performed database migration and integration from the source server and prepared data mappings.
- Created custom component to extract identical data from different servers of the same organization.
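The DataStage-to-NDM-to-Hadoop handoff above can be sketched as a landing-zone script: count the records in an extract and write a control file so the receiving side can verify a complete transfer. File names are illustrative, and the actual NDM (Connect:Direct) and HDFS commands are site-specific, so they appear only as comments.

```shell
#!/bin/sh
# Landing-zone handoff sketch: record-count an extract produced by a
# DataStage job and emit a control file for the downstream transfer.

EXTRACT=merchant_extract.dat

# Sample extract standing in for real DataStage output (invented data)
cat > "$EXTRACT" <<'EOF'
1001,ACME,2015-01-01
1002,GLOBEX,2015-01-02
EOF

RECORDS=$(grep -c . "$EXTRACT")

# Control file lets the receiving side verify a complete transfer
printf 'file=%s\nrecords=%s\n' "$EXTRACT" "$RECORDS" > "${EXTRACT}.ctl"

# In production, the NDM submit and HDFS put would follow, e.g.:
#   direct < merchant.cdp              # hypothetical Connect:Direct process
#   hdfs dfs -put "$EXTRACT" /landing/merchant/
cat "${EXTRACT}.ctl"
```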
Environment: IBM Information Server 9.1, DataStage ETL (Director, Designer, Administrator) 11.7, Hadoop Hue 3 and 4, Oracle 11g & 12c, Data Mapping, Unix, NDM, SQL Server 2014, Agile Development, Toad, Visio, Putty, Flat Files, SQL Scripting, Mainframes, Rally, Jira, Autosys, WinSCP, PowerBroker.
Confidential - NJ
ETL Designer/Data Stage Developer
- Designed jobs to load data into staging tables after performing required transformations depending on the business rules provided by the client.
- Worked with quality stage team to standardize and format the data.
- Deployed Data Stage jobs to different environments; development environment to test environment and from test environment to production environment.
- Worked with CICS, TSO, COBOL, VSAM, CA Easytrieve/Plus, Insync, Apptune, Endevor, DumpMaster, JCL, batch and online programming.
- Used different partitioning methods in different parallel stages to minimize the cost of the jobs.
- Extracted data from different sources like Oracle tables, SQL Server tables and flat files.
- Populated the data into Dimension Tables, Fact tables and Staging tables.
- Used multiple stages like Change Data Capture, Sequential File, Transformer, Aggregator, Join, Lookup, Sort and Filter during ETL development.
- Implemented the error handling logic to report the data errors to the Data Owners.
- Designed and used shell scripts that automate the data stage jobs and validate files.
- Worked closely with data modeler and database administrator for finalizing the data warehouse model.
- Designed shared containers to build reusable components in the project.
- Used native database stages when connecting to databases, which was one of our performance-tuning measures.
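The validate-then-run pattern from the shell-script bullet above can be sketched as follows. The feed file, field count and job name are invented for the example, and the `dsjob` trigger is commented out because it needs a DataStage engine.

```shell
#!/bin/sh
# Sketch: validate a flat file before triggering a DataStage job.

FILE=daily_feed.txt
EXPECTED_FIELDS=3

# Sample pipe-delimited feed standing in for the real file
cat > "$FILE" <<'EOF'
A|B|C
D|E|F
EOF

validate_file() {
    f=$1
    # file must exist and be non-empty
    [ -s "$f" ] || { echo "empty or missing: $f" >&2; return 1; }
    # every record must have the expected number of fields
    bad=$(awk -F'|' -v n="$EXPECTED_FIELDS" 'NF != n' "$f" | grep -c .)
    [ "$bad" -eq 0 ] || { echo "malformed records: $bad" >&2; return 1; }
    return 0
}

if validate_file "$FILE"; then
    echo "PASS $FILE" > validate.status
    # dsjob -run -jobstatus dw_project load_staging   # hypothetical job; needs a DataStage engine
else
    echo "FAIL $FILE" > validate.status
    exit 1
fi
cat validate.status
```

Writing a status file rather than only echoing lets a scheduler such as Autosys poll the result.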
Environment: IBM WebSphere 8.0, Visio, Oracle 10g/9i, SQL Server, UNIX, Windows XP, Toad, Mainframes.
Confidential - MI
Data Stage Developer
- Understood the specifications for Data Warehouse ETL Processes by interacting with the client and end users for gathering business requirements.
- Designed the specifications, analyzed the data for business requirements, and developed metadata mappings and detailed design documents.
- Designed, developed and documented the ETL (Extract, Transformation and Load) strategy to populate the Data Warehouse from various source systems (Teradata) feeds using Data Stage, and Unix Shell scripts.
- Used different parallel job stages like Join, Merge, Lookup, Filter, Column and Row Generator, Transformer, Modify, Aggregator, Remove Duplicates, Teradata Enterprise, Teradata Load, and Teradata MultiLoad/TPump/FastExport stages.
- Used Teradata utilities like FastLoad, MultiLoad, the TPT API interface and BTEQ to efficiently handle data.
- Created Master sequencers using User Variable Activity, Job Activity, Execute Command Stage, Start Loop Activity, End Loop Activity, Nested Condition, Exception Handler Stages.
- Used DataStage Director and the runtime engine to schedule server jobs and to monitor and debug their components.
- Performed DataStage admin activities such as creating ODBC connections to various data sources, server start-up and shutdown, creating environment variables, DataStage projects, message handler files, the DSParams file and JCL.
- Designed and created weekly and monthly spending reports in financial process.
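The BTEQ usage above is typically scripted from the shell: generate a command file, then feed it to `bteq` on a Teradata host. In this sketch the logon string, database and table names are placeholders, not real credentials, and the actual `bteq` run is shown only as a comment.

```shell
#!/bin/sh
# Sketch of scripted BTEQ: generate a command file from the shell.

cat > load_check.bteq <<'EOF'
.LOGON tdprod/etl_user,etl_password;
SELECT COUNT(*) FROM staging.daily_sales;
.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
.QUIT 0;
EOF

# Actual run (requires Teradata client tools):
#   bteq < load_check.bteq
echo "generated load_check.bteq ($(grep -c . load_check.bteq) lines)"
```

The `.IF ERRORCODE` check is what lets a calling sequencer or scheduler distinguish a clean run (exit 0) from a failed one (exit 8).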
Environment: IBM Information Server 9.1, IBM Information Analyzer 9.1, DataStage 9.1 (Designer, Director and Administrator), Oracle 10g/11g, DB2, SQL, PL/SQL, SQL*Loader, SAP interface, COBOL flat files, UNIX shell script, TOAD 7.3, ERwin 4.2, Windows XP and AIX UNIX 4.2, Autosys.
Data Stage Developer
- Analyzed, conceptualized and designed the database that provides critical business metrics.
- Developed ETL procedures to ensure conformity, compliance with standards and lack of redundancy, translating business rules and functionality requirements into ETL procedures using Informatica PowerMart.
- Worked with ERwin tool in Data Modeling (both Physical and Logical Design).
- Developed and documented data Mappings/Transformations, Audit procedures and Informatica sessions.
- Assisted in the design and Maintenance of the Metadata environment.
- Developed and tested all the backend programs, Informatica mappings and update processes.
- Effectively managed the migration of the transformations/mappings from development to Production.
- Developed various bulk load and update procedures and processes using SQL * Loader and PL/SQL.
- Managed and programmed large systems in COBOL II, VSAM, ISPF, TSO, CICS, BMS, File-AID, and JCL on an IBM mainframe.
- Error-checked and tested the ETL procedures and programs using the Informatica session logs.
Environment: Ascential DataStage 7.5 (Manager, Designer, Director, Administrator), Oracle 9i, SQL, PL/SQL, SQL*Loader, UNIX, Mainframe, COBOL flat files.
- Created the database tables, indexes, constraints in Oracle8i.
- Tuned Oracle database queries using indexes, explain plans and optimizer hints.
- Developed packages, stored procedures, functions, triggers, views and cursors using PL/SQL.
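PL/SQL objects like the triggers above are usually deployed by writing the DDL to a file and running it through `sqlplus`. The sketch below does the file-generation half; schema and object names are illustrative only, and the `sqlplus` call is a comment because it needs an Oracle client and credentials.

```shell
#!/bin/sh
# Sketch of a scripted PL/SQL deployment: write trigger DDL to a file.

cat > create_audit_trigger.sql <<'EOF'
CREATE OR REPLACE TRIGGER trg_orders_audit
AFTER INSERT OR UPDATE ON orders
FOR EACH ROW
BEGIN
  -- record the change for audit, as in the trigger bullets above
  INSERT INTO orders_audit (order_id, changed_on)
  VALUES (:NEW.order_id, SYSDATE);
END;
/
EOF

# Actual deployment (requires Oracle client and credentials):
#   sqlplus -s user/password@db @create_audit_trigger.sql
echo "wrote create_audit_trigger.sql"
```

Keeping the DDL in a file rather than inline makes it versionable and reusable across environments.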
Environment: Oracle 8i, SQL, PL/SQL, UNIX, Windows NT.
- Provided back-end support for customized forms like customer and supplier maintenance, bill, receipt, letter of credit, invoice, debit and credit notes.
- Configured web.xml and struts-config.xml according to the struts framework.
- Involved in daily, weekly and monthly loads of data from Mainframe data sets to the Oracle database.
- Developed PL/SQL packages to monitor revenue earnings and expenses and to maintain the general ledger of the organization.
- Wrote Database objects like Triggers, Stored procedures in SQL, PL/SQL.
- Wrote database triggers for audit and data validation.
- Designed the entity-relationship model and normalized database.
Environment: Oracle 8i, SQL, PL/SQL, UNIX and Windows NT.