DataStage Lead Developer / Sr. SAP Business Objects Developer Resume
Middletown, NJ
SUMMARY:
- 9+ years of progressive IT experience with a background in Data Integration (ETL), data storage (DW/ODS), data delivery, business intelligence (BI) reporting, and analytics, with full life-cycle data warehouse and BI development and production support.
- Database experience: worked with databases including Oracle 10g/11g, Teradata v14/v15, and Vertica Analytic Database v8/v9.
- Effectively made use of database triggers, cursors, materialized views, in-line views, projections, packages, stored procedures, analytical functions, indexes, partitioning, and database constraints.
- Expertise in SQL*Loader, SQL*Plus, PL/SQL, and Oracle DBMS packages.
- Worked with IBM InfoSphere DataStage versions 8.x and 11.x components including Designer, Director, and Administrator.
- Strong understanding of data warehousing principles, including fact tables, dimension tables, and star/snowflake schema modeling.
- Experienced in scheduling sequence and parallel jobs using DataStage Director, UNIX scripts and scheduling tools.
- Experience working on multi-dimensional warehouse projects.
- Experience in reading and loading high-volume Type 2 dimensions by implementing SCD (Slowly Changing Dimensions); a minimal SQL sketch follows this summary.
- Extensive knowledge of writing complex SQL queries.
- Experience using various database interaction utilities such as TOAD, Teradata SQL Assistant, and DBeaver.
- Knowledge of Erwin as a leading data modeling tool for logical (LDM) and physical (PDM) data models.
- Extensively worked with Teradata utilities such as BTEQ, FastExport, FastLoad, and MultiLoad to export and load data to/from different source systems, including flat files. Strong hands-on experience with Teradata Parallel Transporter (TPT) scripting.
- Utilized SAP Business Objects for reporting and analysis of data. Designed and developed BO reports/universes accessing a Vertica database.
- Strong understanding of universe design, development, alteration, and maintenance using the Universe Design Tool (UDT).
- Developed Web Intelligence (WEBI) reports with features such as ranking, prompts, drill-down, hyperlinks, alerts, custom SQL, merged dimensions, and formulas using context operators.
- Ability to lead a group, take initiative, make decisions, inspire a team to meet deadlines, and handle crises as part of team management.
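A minimal sketch of the Type 2 pattern referenced above, in generic Oracle-flavored SQL; every table and column name here (stg.customer, dw.customer_dim, cust_addr, the effective-date columns) is a hypothetical placeholder rather than a detail of any actual engagement, and dialect specifics (alias syntax, date arithmetic) vary slightly between Oracle and Teradata:

    /* Step 1: close out the current row for any business key whose tracked
       attribute changed; the current row is flagged and open-ended. */
    UPDATE dw.customer_dim tgt
    SET    eff_end_dt   = CURRENT_DATE - 1,
           current_flag = 'N'
    WHERE  tgt.current_flag = 'Y'
    AND    EXISTS (SELECT 1
                   FROM   stg.customer src
                   WHERE  src.cust_id   = tgt.cust_id
                   AND    src.cust_addr <> tgt.cust_addr);

    /* Step 2: insert a fresh current version for changed keys (whose old row
       was just closed) and for brand-new keys; neither has an open row now. */
    INSERT INTO dw.customer_dim
           (cust_id, cust_addr, eff_start_dt, eff_end_dt, current_flag)
    SELECT src.cust_id, src.cust_addr, CURRENT_DATE, DATE '9999-12-31', 'Y'
    FROM   stg.customer src
    LEFT JOIN dw.customer_dim tgt
           ON  tgt.cust_id      = src.cust_id
           AND tgt.current_flag = 'Y'
    WHERE  tgt.cust_id IS NULL;

The statement order matters: the insert relies on the update having already closed the superseded rows, so changed and brand-new keys alike show up with no open row.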
TECHNICAL SKILLS:
Tools: TOAD, Teradata SQL Assistant, DBeaver, SQL Server Management Studio 2008, Teradata Manager, PMON, PuTTY, IBM Tivoli TWS, crontab, Teradata Viewpoint
Data Integration: IBM DataStage versions 8.5, 8.7, 11.3
RDBMS: Oracle 10g/11g, MS SQL Server, Teradata v14/v15
Data Modeling: Erwin
Languages & Utilities: Unix shell scripting, Oracle SQL, PL/SQL, Teradata macros, BTEQ, MultiLoad, FastLoad, FastExport, Teradata Parallel Transporter (TPT), TASM
Others: SAP Business Objects BI 4.2, Universe Designer, Web Intelligence (WEBI), Agile
EXPERIENCE:
Confidential, Middletown, NJ
DataStage Lead Developer / Sr. SAP Business Objects Developer
Responsibilities:
- Led a team of ETL developers and Business Objects developers, coordinating with the client team to gather new requirements and work existing issues.
- Attended meetings with the client and business teams to understand requirements, and prepared the low-level design and technical specification documents.
- Identified impacts and created data mapping sheets.
- Actively participated in decision-making and QA meetings, and regularly interacted with the Business Analysts and development team to gain a better understanding of the business process, requirements, and design.
- Used Oracle PL/SQL as an ETL tool to extract data from source systems and load it into the Oracle database.
- Designed and developed procedures and packages to extract data from feed files, applied transformation logic to the extracted data, and loaded it into the data mart.
- Created Oracle Scheduler chains to run packages in sequence (a hedged sketch appears after this list).
- Coordinated with team members and administered all onsite and offshore work packages.
- Tuned job performance by interpreting the performance statistics of the jobs developed.
- Used DataStage as an ETL tool to extract data from source systems and load it into the Oracle database.
- Designed and developed DataStage jobs to extract data from feed files and Oracle sources, apply transformation logic, and load the results into the data mart.
- Created DataStage jobs using stages such as Transformer, Aggregator, Sort, Join, Merge, Lookup, Data Set, Funnel, Remove Duplicates, Copy, Modify, Filter, Change Data Capture, Change Apply, Column Generator, Difference, Row Generator, Sequential File, and CFF, as well as Sequencer, Email Communication, Command, and Terminator activities.
- Used DataStage Director and its run-time engine to monitor, test, and debug job components and to monitor the resulting executables on an ad hoc or scheduled basis.
- Converted complex job designs into separate job segments executed through job sequencers for better performance and easier maintenance.
- Created master job sequencers.
- Performed ETL operations, statistics reporting, and analysis using SQL Server integration tooling.
- Coded shell scripts to create files from the database using .sql scripts, format the incoming/outgoing files, and load the files into the database tables used by DataStage.
- Provided production support for DataStage applications and Business Objects reports/universes.
- Applied normalization principles to improve performance. Developed ETL code using PL/SQL and SQL*Loader, calling UNIX scripts to download and manipulate files.
- Performed SQL and PL/SQL tuning using EXPLAIN PLAN, SQL*Trace, and AUTOTRACE.
- Extensively used bulk collections in PL/SQL objects for better performance.
- Extensively used the advanced features of PL/SQL like Records, Tables, Object types and Dynamic SQL.
- Created universes and developed Web Intelligence (WEBI) reports with features such as ranking, prompts, drill-down, hyperlinks, alerts, custom SQL, merged dimensions, and formulas using context operators.
- Created Teradata macros and implemented various Teradata analytic functions.
- Good knowledge of Teradata Manager, TDWM, PMON, and BTEQ.
- Maintaining DR (Disaster Recovery) servers in sync with the active servers.
- Assigning work within the team. Maintaining and tracking issues to resolution.
- Creating and maintaining weekly status reports on all current projects.
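A hedged sketch of the Oracle Scheduler chain pattern from the bullet above. It assumes two scheduler programs (STG_LOAD_PROG and MART_LOAD_PROG, hypothetical names) were created and enabled separately via DBMS_SCHEDULER.CREATE_PROGRAM to wrap the load packages; all object names and the schedule are illustrative only:

    BEGIN
      DBMS_SCHEDULER.CREATE_CHAIN(chain_name => 'ETL_NIGHTLY_CHAIN');

      -- Each step runs one of the pre-created scheduler programs.
      DBMS_SCHEDULER.DEFINE_CHAIN_STEP('ETL_NIGHTLY_CHAIN', 'STG_LOAD',  'STG_LOAD_PROG');
      DBMS_SCHEDULER.DEFINE_CHAIN_STEP('ETL_NIGHTLY_CHAIN', 'MART_LOAD', 'MART_LOAD_PROG');

      -- Rules wire the steps into a strict sequence.
      DBMS_SCHEDULER.DEFINE_CHAIN_RULE('ETL_NIGHTLY_CHAIN', 'TRUE',                'START STG_LOAD');
      DBMS_SCHEDULER.DEFINE_CHAIN_RULE('ETL_NIGHTLY_CHAIN', 'STG_LOAD COMPLETED',  'START MART_LOAD');
      DBMS_SCHEDULER.DEFINE_CHAIN_RULE('ETL_NIGHTLY_CHAIN', 'MART_LOAD COMPLETED', 'END');

      DBMS_SCHEDULER.ENABLE('ETL_NIGHTLY_CHAIN');

      -- A CHAIN-type job launches the whole chain on a nightly calendar.
      DBMS_SCHEDULER.CREATE_JOB(
        job_name        => 'ETL_NIGHTLY_JOB',
        job_type        => 'CHAIN',
        job_action      => 'ETL_NIGHTLY_CHAIN',
        repeat_interval => 'FREQ=DAILY;BYHOUR=2',
        enabled         => TRUE);
    END;
    /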
Environment: DataStage 11.3, Toad for Oracle, SAP Business Objects 4.2, Oracle 10g/11g, Unix shell scripting, Vertica, Teradata, Erwin
Confidential
Teradata Lead Developer / Sr. PL/SQL Developer
Responsibilities:
- Involved in high-level design discussions with the client and created the application design accordingly.
- Responsible for solving data related issues and verification of data.
- Parameterized all file paths to allow runtime flexibility and ease of migration between server environments.
- Designed/wrote the tech specs (Source-Target mappings) for the ETL mappings along with the Unit Test scripts.
- Used partitioning and parallelism techniques to increase the performance in DataStage.
- Developed component solutions with an end-to-end perspective.
- Developed Teradata macros and stored procedures to load data into incremental/staging tables and then move the data from staging into base tables.
- Reviewed SQL for missing joins and join constraints, data format issues, mismatched aliases, and casting errors.
- Responsible for design, data mapping analysis, and mapping rules.
- Responsible for development, coding, and testing.
- Responsible for implementation and post-implementation support.
- Extensively used loader utilities to load flat files into the Teradata RDBMS (a FastLoad sketch appears at the end of this list).
- Used BTEQ and Teradata SQL Assistant front-end tools to issue SQL commands matching the business requirements to Teradata RDBMS.
- Performed high volume maintenance on large Teradata tables using MultiLoad loader utility.
- Created TPT scripts to transfer data from flat files to Teradata.
- Used the FastExport utility to extract large volumes of data at high speed from the Teradata RDBMS.
- Developed TPump scripts to load low-volume data into the Teradata RDBMS in near real time.
- Collected statistics periodically on tables to improve system performance.
- Performed tuning and optimization of application SQL using Query analyzing tools.
- Worked with SQL*Loader to load data into Oracle from flat files obtained from various facilities every day. Used the standard packages UTL_FILE and DBMS_SQL and PL/SQL collections, and used bulk binding in writing database procedures, functions, and packages for the front-end module.
- Used Oracle Scheduler for job monitoring, testing, and debugging, and for monitoring the resulting jobs on an ad hoc or scheduled basis.
- Used file-manager XML for pulling files from different sources and formatting them for ETL purposes.
- Wrote Unix shell scripts to process files on a daily basis: renaming, unzipping, and removing junk characters before loading them into staging tables.
- Used the ClearCase and SVN version control tools for version control and for moving code to higher environments such as SIT, UAT, pre-production, and production.
- Prepared DDLs for table creation, table modification, and index changes; tested and executed them in all environments: Dev, CIT, SIT, UAT, pre-production, and production.
- Prepared DMLs for maintenance tables; reviewed, tested, and executed them in Oracle.
- Used DataStage as an ETL tool to extract data from source systems and load it into the Oracle database.
- Designed and developed DataStage jobs to extract data from feed files and Oracle sources, apply transformation logic, and load the results into the data mart.
- Created DataStage jobs using stages such as Transformer, Aggregator, Sort, Join, Merge, Lookup, Data Set, Funnel, Remove Duplicates, Copy, Modify, Filter, Change Data Capture, Change Apply, Column Generator, Difference, Row Generator, Sequential File, and CFF, as well as Sequencer, Email Communication, Command, and Terminator activities.
- Used DataStage Director and its run-time engine to monitor, test, and debug job components and to monitor the resulting executables on an ad hoc or scheduled basis.
- Converted complex job designs into separate job segments executed through job sequencers for better performance and easier maintenance.
- Created master job sequencers.
- Created shell scripts to run DataStage jobs from UNIX and Tivoli, and scheduled those scripts through the crontab and Tivoli schedulers.
- Coordinated with team members and administered all onsite and offshore work packages.
- Tuned job performance by interpreting the performance statistics of the jobs developed.
- Documented ETL test plans, test cases, test scripts, and validations based on design specifications for unit, system, and functional testing; prepared test data for testing, error handling, and analysis.
- Represented components in user acceptance testing calls.
- Defect and delivery management.
- End-to-end (E2E) release activities.
- Provided knowledge transfer (KT) to new recruits to improve their skills.
- Ensured best practices and standards for DWH solutions.
- Conducted and led whiteboarding sessions, workshops, design sessions, and project meetings as needed, playing a key role in client relations.
- Ensured project deliverables were delivered on time and within budget for each project release.
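For the loader-utility bullets above, a minimal Teradata FastLoad sketch; the logon string, file path, and all table/column names are placeholders, and FastLoad itself requires an empty target table:

    /* Pipe-delimited input; in VARTEXT mode every field arrives as VARCHAR. */
    LOGON tdpid/etl_user,password;
    DATABASE edw_stg;
    SET RECORD VARTEXT "|";

    DEFINE
        cust_id   (VARCHAR(18)),
        cust_name (VARCHAR(100)),
        cust_addr (VARCHAR(200))
    FILE = /data/inbound/customer.dat;

    /* Error tables capture constraint and conversion rejects for later review. */
    BEGIN LOADING edw_stg.stg_customer
        ERRORFILES edw_stg.stg_customer_e1, edw_stg.stg_customer_e2
        CHECKPOINT 100000;

    INSERT INTO edw_stg.stg_customer
    VALUES (:cust_id, :cust_name, :cust_addr);

    END LOADING;
    LOGOFF;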
Environment: DataStage 8.7, Teradata v14, Teradata SQL Assistant, MultiLoad, FastLoad, BTEQ, TPump, Erwin, Unix scripting, Teradata macros, Teradata Parallel Transporter (TPT), Toad for Oracle, PL/SQL, Korn shell scripts, Oracle
Confidential
Sr. DataStage Developer / Data Modeler
Responsibilities:
- Extensively involved in meetings with the business team and analysts to gather requirements, understand the functionality of the system, and understand the business rules.
- Created Source-to-Landing and Landing-to-Target mapping documents to load data from the sources to the Teradata target area with business rules applied.
- Designed and implemented the business requirements through mappings in DataStage and TPT scripts.
- Played a vital role in estimating the work effort required to complete the activities under different modules.
- Designed and developed DataStage jobs to extract data from feed files and Oracle sources, apply transformation logic, and load the results into the data mart.
- Created DataStage jobs using stages such as Transformer, Aggregator, Sort, Join, Merge, Lookup, Data Set, Funnel, Remove Duplicates, Copy, Modify, Filter, Change Data Capture, Change Apply, Column Generator, Difference, Row Generator, and Sequential File, as well as Sequencer, Email Communication, Command, and Terminator activities.
- Used DataStage Director and its run-time engine to monitor, test, and debug job components and to monitor the resulting executables on an ad hoc or scheduled basis.
- Created master job sequencers.
- Experience with the Tivoli Workload Scheduler for automating the ETL process.
- Created physical and logical models using Erwin to effectively translate the conceptual model into logical and physical models conforming to the business and data requirements.
- Monitored and controlled the system using the Teradata Administrator and Teradata Manager tools.
- Created and modified MultiLoad and Teradata Parallel Transporter jobs using UNIX and loaded data into the DW.
- Loaded data from various source systems into the Teradata warehouse using BTEQ, FastExport, MultiLoad, FastLoad, and TPT.
- Expertise in the Teradata cost-based query optimizer; identified potential bottlenecks in queries from the aspects of query writing, skewed redistributions, join order, optimizer statistics, and physical design considerations (PI, USI, NUSI, and JI). In-depth knowledge of Teradata Explain and Visual Explain for analyzing and improving query performance.
- Wrote complex SQL queries using joins, subqueries, and inline views to retrieve data from the database.
- Created shell scripts to automate data feed pulls from remote FTP servers and imported the data through SQL*Loader.
- Improved database performance by tuning SQL statements, creating materialized views, and tuning indexes in the data warehouse.
- Troubleshot Vertica query performance and improved query response times by up to 50 percent by reading query plans, redesigning projections, and analyzing system tables (see the projection sketch after this list).
- Created database objects such as stored procedures, functions, packages, cursors, ref cursors, and triggers.
- Created and used XML for data management and transmission.
- Involved in unit testing, integration testing, UAT, and performance testing.
- Assessed and recommended data platform governance covering data quality, metadata, and performance management.
- Defined and drove adoption of DW standards, processes, and best practices.
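A sketch of the projection-redesign approach behind the Vertica bullet above; the schema is hypothetical, and the usual tuning levers are the ORDER BY (to match the dominant predicate or join column) and the segmentation key (high-cardinality, to spread rows evenly across nodes):

    /* Query-specific projection for lookups by customer over time. */
    CREATE PROJECTION sales.orders_by_cust
    AS SELECT order_id, cust_id, order_dt, order_amt
       FROM   sales.orders
       ORDER BY cust_id, order_dt
       SEGMENTED BY HASH(order_id) ALL NODES;

    /* Populate the new projection from existing data. */
    SELECT REFRESH('sales.orders');

Comparing EXPLAIN output before and after the change confirms whether the optimizer actually picks the new projection.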
Environment: IBM InfoSphere DataStage 8.7, Tivoli Workload Scheduler, Korn shell scripts, Toad for Oracle, Erwin, Oracle 10g, Teradata, Vertica
Confidential
Software Engineer / SQL Developer
Responsibilities:
- Created, manipulated, and supported the Oracle and Teradata databases.
- Understood the specifications and the business requirements from the client's perspective.
- Designed the understanding documents and technical specification documents after thoroughly reviewing the functional specifications from the business team and product management.
- Extensively worked on query optimization and performance tuning.
- Created integrity constraints and database triggers for data validations.
- Created relevant staging tables to load the CSV files, identified the business validation rules, and created SQL*Loader scripts along with UNIX shell scripting and PL/SQL.
- Worked with IMPLICIT, EXPLICIT, and REF cursors.
- Created and maintained stored procedures, functions, packages, and triggers using TOAD. Wrote heavy stored procedures using dynamic SQL to populate data into temporary tables from fact and dimension tables for analytic purposes.
- Experience using various Oracle PL/SQL collections: VARRAYs, nested tables, and associative arrays (INDEX BY VARCHAR2).
- Involved in creating various utility scripts to generate log files in UNIX using shell scripts.
- Tuned several Oracle SQL statements using the Explain Plan and Auto Trace utilities.
- Involved in implementing table partitioning using range, hash, and composite techniques.
- Applied SQL hints during optimization to speed up SQL processing.
- Experience working with a variety of analytical functions, such as RANK, DENSE_RANK, LAG, and LEAD with OVER (PARTITION BY ...); see the query sketch at the end of this section.
- Designed & Developed logical and physical data models for star and snowflake schemas using Erwin.
- Implemented automatic primary key generation using Oracle sequences.
- Developed PL/SQL, including complex queries and stored procedures, for Oracle databases.
- Created scheduler chains for running packages in sequence.
- Analyzed and resolved data quality issues raised by users.
- Performed unit testing, data integration testing, and user acceptance testing (UAT) for every code change and enhancement.
- Enforced database security by authenticating users with logon triggers. Created materialized views on a remote database and automated their refresh through the scheduler.
- Prepared BTEQ scripts to apply all the transformations and business rules in the Teradata staging area to load the core tables.
- Developed various jobs and loaded the data into the warehouse, analyzing and resolving any defects in the applications as part of warranty support to the Production & Support team.
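As an illustration of the analytic functions listed above, a small self-contained query against a hypothetical orders table (the same OVER clause patterns work in both Oracle and Teradata):

    /* Per-customer ranking by amount, plus neighboring order dates. */
    SELECT cust_id,
           order_dt,
           order_amt,
           RANK()         OVER (PARTITION BY cust_id ORDER BY order_amt DESC) AS amt_rank,
           DENSE_RANK()   OVER (PARTITION BY cust_id ORDER BY order_amt DESC) AS amt_dense_rank,
           LAG(order_dt)  OVER (PARTITION BY cust_id ORDER BY order_dt)       AS prev_order_dt,
           LEAD(order_dt) OVER (PARTITION BY cust_id ORDER BY order_dt)       AS next_order_dt
    FROM   sales.orders;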
Environment: Toad for Oracle, Erwin, Oracle 10g, Bourne shell scripts, WinSCP, Teradata