
ETL Architect Resume


Chicago, IL

SUMMARY

  • Over 10 years of IT experience in the implementation and development of Data Warehouses/Data Marts and data integration.
  • 10 years of Extraction, Transformation and Loading (ETL) experience using IBM Information Server 8.0.1/DataStage 7.5.2/7.0/6.x/5.x (Administrator, Director, Manager, Designer), MetaStage, Parallel Extender, Integrity, OLAP and OLTP
  • Expertise in Data Warehousing/ETL programming and fulfillment of data warehouse project tasks such as data extraction, cleansing, transforming and loading
  • Worked with Local Containers, Shared Containers, and Job Sequencers
  • Conversant with writing Basic Expressions, External Transformers and functions
  • Expertise in dimensional data modeling, Star schema modeling, Snowflake modeling, identification of fact and dimension tables, Normalization, and Physical and Logical data modeling using Erwin
  • Profound knowledge of Microsoft SQL Server 2005/2000/7.0/6.5, Oracle 10g/9i/8i, SQL, PL/SQL, SQL*Loader, and Windows NT 4.0
  • Good understanding of Business Objects OLAP tool.
  • Extensive experience in client-server and internet application development using Oracle, MS SQL Server, PL/SQL, Stored procedures, Triggers, JDBC, ODBC and Visual Basic
  • Exposure to Conceptual, Logical and Physical data modeling
  • Excellent analytical, interpersonal and communication skills

TECHNICAL SKILLS

Operating Systems: Windows XP/NT/2000, UNIX, Red Hat Linux, Sun Solaris, DOS

ETL Tools: IBM Information Server 8.5/8.0.1/7.5.x/7.0/6.x/5.2, Parallel Extender, SSIS

RDBMS: Oracle 10g/9i/8i, MS Access 2003, DB2 UDB, MS SQL Server 2005/2000/7.0, Teradata v12

Languages: Java, XML, XSL, PL/SQL, C, C++

Web Technologies: HTML, Dreamweaver

Protocols: TCP/IP, FTP, HTTP

PROFESSIONAL EXPERIENCE

Confidential, Chicago, IL

ETL Architect

Responsibilities:

  • Designed & Developed ETL architecture in accordance with business requirements
  • Involved in converting Conceptual Model to Logical and Physical Models
  • Provided an architecture solution that is reusable, maintainable and scalable
  • Extracted data from different schemas in the National Data Warehouse (NDW) and loaded it into a DB2 staging database to build extracts
  • Created DataStage jobs and tuned them for better performance
  • Developed generic UNIX shell scripts for running DataStage jobs, data loading procedures, file control, and job control.
  • Participated in requirements gathering and refining discussions.
  • Participated in design review sessions.
  • Delivered the data for vendors based on the requirement specifications.
  • Wrote ETL Design and Functional Design Specification documents.
  • Created Tivoli processes for scheduling and running DataStage jobs.
  • Maintained documentation of error recovery procedures
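
The generic job-runner scripts described above typically wrap Information Server's `dsjob` command-line utility. A minimal sketch, assuming `dsjob` is on the PATH; the function name, and any project, job, and parameter names passed to it, are illustrative:

```shell
#!/bin/ksh
# Sketch of a generic DataStage job runner. Assumes the Information
# Server client's dsjob utility is on the PATH.
run_job() {
  project=$1
  job=$2
  if [ -z "$project" ] || [ -z "$job" ]; then
    echo "Usage: run_job <project> <job> [name=value ...]" >&2
    return 1
  fi
  shift 2
  params=""
  for p in "$@"; do
    params="$params -param $p"   # forward job parameters (no spaces in values)
  done
  # -run blocks until the job finishes; -jobstatus makes the exit
  # code reflect the job's final state (0 = finished OK)
  dsjob -run -jobstatus $params "$project" "$job"
}
```

A wrapper script would then call, for example, `run_job MYPROJ LoadStaging RunDate=2011-06-30` and act on the exit code.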

Environment: IBM Information Server 8.5, DB2 V9.7, UNIX AIX 4.2, IBM Tivoli, UNIX Scripting KSH, MS Visio

Confidential, Chicago, IL

Sr. ETL Consultant

Responsibilities:

  • Involved in source system analysis, extracting and analyzing data for the ODS and SDW
  • Involved in converting Conceptual Model to Logical and Physical Models
  • Extensively worked on Parallel and Server Jobs.
  • Extracted data from different sources and loaded it into a DB2 database
  • Implemented local containers within individual jobs and shared containers across multiple jobs that share the same business logic
  • Created DataStage jobs, batches and job sequences and tuned them for better performance
  • Extensively worked with Parallel Extender for parallel processing to improve job performance while working with bulk data sources
  • Used look up stage, join stage, merge stage, funnel stage and filter stage to perform the transformation on source data
  • Developed generic UNIX shell scripts for running DataStage jobs, data loading procedures, file control, and job control.
  • Performed validation and execution of jobs using DataStage Director
  • Used DataStage to perform ETL operations for Operational Data Store (ODS) needs as well as auditing purposes.
  • Worked on building the Strategic Data Warehouse (SDW) for analytics and reporting, and implemented Slowly Changing Dimensions (SCD Type 2)
  • Delivered the data for different vendors based on the requirement specifications.
  • Wrote ETL Design and Functional Design Specification documents.
  • Created Zena processes for scheduling and running DataStage jobs.

Environment: IBM Information Server 8.5/8.1, DB2 V9, UNIX AIX 4.2, ASG-Zena, UNIX Scripting KSH, MS Visio, Teradata 13.1

Confidential, Princeton, NJ

Sr. ETL Developer

Responsibilities:

  • Involved in the design, development, implementation and maintenance of DataStage jobs
  • Supported the maintenance of various Projects and handled production issues/exceptions.
  • Involved in extraction, transformation and loading of the data using Built-in, plug-in and custom stages
  • Implemented local containers within individual jobs and shared containers across multiple jobs that share the same business logic
  • Created DataStage jobs, batches and job sequences and tuned them for better performance
  • Extensively worked with Parallel Extender for parallel processing to improve job performance while working with bulk data sources
  • Used look up stage, join stage, merge stage, funnel stage and filter stage to perform the transformation on source data
  • Developed generic UNIX shell scripts for running DataStage jobs, data loading procedures, file control, and job control.
  • Performed validation and execution of jobs using DataStage Director
  • Used DataStage to perform ETL operations for Master Data Management (MDM) needs as well as auditing purposes.
  • Delivered the data for different vendors based on the requirement specifications.
  • Wrote ETL Design and Functional Design Specification documents.
  • Created Autosys JILs for scheduling and running DataStage jobs.

Environment: IBM Information Server 8.0.1, Oracle 10g, UNIX AIX 4.2, Autosys, UNIX Scripting KSH, MS Visio

Confidential, Charlotte, NC

Sr. DataStage Developer

Responsibilities:

  • Interacted with finance domain experts to define business requirements and design new strategies for repeatable processes.
  • Performed design and code reviews for projects to ensure alignment with best practices and standards
  • Developed ETL Technical Specification (mapping) documents.
  • Developed ETL design, development, implementation standards and procedures based on industry best practices
  • Developed DataStage Parallel (PX) jobs using the Lookup, Join, Copy, Aggregator, Surrogate Key, Column Generator, Dataset, Transformer, Lookup Fileset, Filter, Sort, Merge, and Funnel stages.
  • Used Parallel Extender to split data into subsets and flow it concurrently across all available processors to improve job performance.
  • Created Oracle PL/SQL Package functions for performance tuning of database queries.
  • Developed generic UNIX shell scripts to run DataStage Jobs, data loading procedures, file control and job control techniques.
  • Performed Unit/Integration testing on ETL Job as per Test Cases documents, ensure standard compliance and validate that business goals are accomplished.
  • Supported production on-call for newly migrated and previously migrated code
  • Used Borland StarTeam as the version control tool to check code in and out.
  • Created Autosys JILs for scheduling and running DataStage jobs.
  • Created Visio Process flow to understand the process and for maintenance purpose
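
The Autosys job definitions mentioned above take the shape of JIL command jobs; a minimal hypothetical example (job name, machine, owner, and script path are illustrative, not from the project):

```
/* Hypothetical Autosys JIL -- all names and paths are illustrative */
insert_job: dw_daily_load   job_type: c
command: /opt/etl/bin/run_datastage.ksh DWPROJ DailyLoadSeq
machine: etlhost01
owner: dsadm
days_of_week: mo, tu, we, th, fr
start_times: "02:00"
std_out_file: /var/log/etl/dw_daily_load.out
std_err_file: /var/log/etl/dw_daily_load.err
alarm_if_fail: 1
```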

Environment: IBM Information Server 8.0.1 PX, Oracle 10g, UNIX AIX 4.2, Autosys, UNIX Scripting KSH, Borland StarTeam, MS Visio

Confidential, Princeton, NJ

Sr. ETL Developer

Responsibilities:

  • Involved in the design, development and implementation of DataStage jobs
  • Implemented various strategies for slowly changing dimensions
  • Created fact tables and dimension tables based on the warehouse design
  • Involved in extraction, transformation and loading of the data using Built-in, plug-in and custom stages
  • Implemented local containers within individual jobs and shared containers across multiple jobs that share the same business logic
  • Created DataStage jobs, batches and job sequences and tuned them for better performance
  • Extensively worked with Parallel Extender for parallel processing to improve job performance while working with bulk data sources
  • Used look up stage, join stage, merge stage, funnel stage and filter stage to perform the transformation on source data
  • Imported metadata, routines and table definitions using DataStage Manager.
  • Developed Shell scripts for running DataStage Jobs.
  • Developed Extract, Transform, Load (ETL) processes using SQL Server 2005 Integration Services (SSIS)
  • Performed validation and execution of jobs using DataStage Director
  • Used SSIS for data cleansing to maintain the quality of the data
  • Used SSIS to perform ETL operations for Master Data Management (MDM) needs as well as auditing purposes.
  • Used SQL Server Agent for scheduling the SSIS packages.
  • Delivered the data for different vendors based on the requirement specifications.
  • Wrote ETL Design and Functional Design Specification documents.
  • Created Autosys JILs for scheduling and running DataStage jobs.

Environment: IBM Information Server 8.0.1 (Parallel Extender), SQL Server Integration Services (SSIS), Oracle 10g, MS SQL Server 2005, UNIX AIX 4.2, Windows NT, MS Visio, Siebel AdvantEDGE 7.7, MS Access 2003, Autosys

Confidential, Minneapolis, MN

Sr. DataStage Developer

Responsibilities:

  • Involved in System analysis and design of Enterprise Data warehouse (EDW).
  • Defined best practices for source system analysis and source data profiling for Master Data Management.
  • Involved in creating Functional and scope documents for ETL process
  • Involved in developing Data mart for Reporting applications using Data Stage
  • Developed data marts for different customers per their requirements, and developed a process to capture statistics (Min, Max, Mean, Sum, and Median) for all tables from source to target (System of Record to data mart tables) to check data integrity
  • Designed, developed and tested the DataStage jobs for extract, transform and load.
  • Developed various jobs using the DB2 Enterprise, Lookup, Dataset, Join, Sort, Aggregator, Filter, Modify, Funnel, Peek (debugging), Remove Duplicates, and Copy stages
  • Extensively worked on writing DataStage BuildOps for various jobs based on the requirements and transformation rules
  • Extensively worked on Type 2 slowly changing dimensions, generating surrogate keys and maintaining referential integrity without slowing down surrogate key replacement.
  • Involved in performance tuning of the Data Stage jobs and queries
  • Extensively worked with Parallel Extender for parallel processing to improve job performance while working with bulk data sources
  • Developed parameterized, reusable DataStage jobs that can be run as multiple instances
  • Developed DataStage jobs to load data into Teradata tables using the FastLoad utility
  • Involved in Extracting, transforming, loading and testing data from Mainframe files, XML files, Flat files and DB2 using DataStage jobs
  • Used DataStage Manager for importing metadata from repository, new job categories and creating new data elements
  • Developed Shell scripts for running DataStage Jobs.
  • Used ClearCase as the version control tool to check code in and out.
  • Used the ClearQuest defect tracking system.
  • Performed unit testing on developed jobs to ensure they met the requirements
  • Created Autosys JILs for scheduling and running DataStage jobs.
  • Created Visio Process flow to understand the process and for maintenance purpose
  • Used PAC2000 V 7.1 tool for creating Work Orders for production installs and Problem Tickets for Production issues
  • Supported production on-call one week every two months for newly installed and previously installed code
  • Supported the Delivery Services team during server migrations by providing instructions.
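
The Teradata loads mentioned above follow the standard FastLoad pattern; a minimal control-script sketch (the tdpid, credentials, table, column, and file names are hypothetical, and FastLoad requires an empty target table):

```
/* Hypothetical FastLoad control script -- all names are illustrative */
LOGON tdprod/etl_user,etl_password;
BEGIN LOADING stg.customer ERRORFILES stg.customer_et, stg.customer_uv;
SET RECORD VARTEXT "|";
DEFINE cust_id (VARCHAR(18)),
       cust_nm (VARCHAR(60))
FILE = /data/in/customer.dat;
INSERT INTO stg.customer (cust_id, cust_nm)
VALUES (:cust_id, :cust_nm);
END LOADING;
LOGOFF;
```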

Environment: IBM WebSphere DataStage 7.5.2/7.5.1 PX (Enterprise Edition), DB2 V8.1/9.1, Teradata v12, UNIX AIX 4.2, Autosys, UNIX Scripting KSH, Windows NT, ClearCase, ClearQuest, PAC2000 V7.1

Confidential, McLean, VA

DataStage Developer

Responsibilities:

  • Involved in gathering requirements and ETL architecture
  • Involved in creating Functional and scope documents for ETL process
  • Designed, developed and tested the DataStage jobs for extract, transform and load.
  • Developed various jobs using the DB2 Enterprise, Lookup, Dataset, Join, Sort, Transformer, Aggregator, Filter, Modify, Funnel, Peek (debugging), Remove Duplicates, and Copy stages.
  • Involved in performance tuning of the Data Stage jobs and queries
  • Extensively worked with Parallel Extender for parallel processing to improve job performance while working with bulk data sources
  • Wrote DB2 SQL for use in DataStage and for testing the data.
  • Involved in extracting, transforming, loading, and testing data from SAS datasets, flat files, and DB2 using DataStage jobs
  • Used DataStage Manager for importing metadata from repository, new job categories and creating new data elements
  • Used both pipeline parallelism and partition parallelism to improve performance.
  • Extensively used DataStage Director for job scheduling, troubleshooting from log files, and emailing production support.
  • Involved in performance tuning of the ETL process and performed the data warehouse testing
  • Extensively worked on error handling, data cleansing, creating hash files, and performing lookups for faster data access.
  • Developed jobs in Parallel Extender using stages such as Transformer, Aggregator, Source Dataset, External Filter, Row Generator, Column Generator, and Vector.
  • Extensively used DataStage Designer to develop processes for extracting, transforming, integrating and loading data from various sources into the Data Warehouse.
  • Debugged job components and monitored the resulting executables
  • Defined UNIX shell scripts for cron tab jobs and file archiving process.
  • Worked on Kenneth Bland's architecture for job running and scheduling
  • Involved in system analysis & design of the data warehouse.
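
The cron-based scheduling and archiving scripts described above amount to crontab entries of this shape (the script paths, log paths, and schedules are hypothetical):

```
# Hypothetical crontab entries -- paths and schedules are illustrative
# Nightly extract at 01:30; weekly file archiving on Sunday at 06:00
30 1 * * *  /opt/etl/bin/nightly_extract.ksh >> /var/log/etl/extract.log 2>&1
0  6 * * 0  /opt/etl/bin/archive_files.ksh   >> /var/log/etl/archive.log 2>&1
```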

Environment: IBM WebSphere DataStage Enterprise Parallel Extender 7.5.2/7.1, UNIX Shell Scripting, DB2, Rational ClearCase, Autosys, Visio, Lotus Notes.

Confidential

Software Programmer

Responsibilities:

  • Performed automation testing using WinRunner and LoadRunner, including functionality developed in Ascential DataStage.
  • Created Test Plan and detailed Test Procedures for the application.
  • Conducted data integrity testing by extensive use of SQL.
  • Conducted System testing and Regression testing.
  • Investigated, noted and collated information on software bugs and reported the same to developers.

Environment: WinRunner, LoadRunner, TestDirector, Ascential DataStage 5.2, Oracle 8.0, MS Office, Windows NT 4.0
