
Sr. Datastage Developer Resume


Cincinnati, OH

SUMMARY:

  • 8 years of experience in the IT industry in system analysis, design, development, testing, implementation, and production support in the field of Enterprise Data Warehousing (EDW) using the IBM InfoSphere Information Server DataStage suite (11.5/9.1/8.5), Big Data Hadoop, Teradata, Oracle, PL/SQL, DB2, SSIS, SQL, UNIX shell scripting, and Linux administration, with BI reporting experience in SAP Business Intelligence 3.1 and 4.0, IBM Cognos Reporting and Analytics, SSRS, and Tableau (self-learning). Industry experience in Banking & Financial Services, including the following applications:
  • Merchant and Investment Banking
  • Human Resources and IT Management
  • Revenue Management and Billing (RMB)
  • Mercury Billing and partner payments
  • Payment Datawarehousing and Analytics (PDW)
  • Finance Data program (FDP)
  • Data Networking and Analytics
  • Experience in all phases of the data warehouse life cycle, involving data analysis, design, development, and testing using ETL, data modeling, and Online Analytical Processing (OLAP) and reporting tools.
  • Hands-on data integration experience developing ETL jobs using DataStage 11.5/9.1/8.5/7.x (Designer, Director, Administrator, Web Console, Operations Console, Information Analyzer, Information Governance Catalogue) and developing parallel jobs using various DataStage stages such as Lookup, Join, Merge, Transformer, Aggregator, Funnel, Remove Duplicates, Oracle Enterprise, DB2, Sequential File, Hive, and Impala
  • Extensively used parallel stages like Row Generator, Column Generator, Head, and Peek for development and debugging purposes
  • Hands-on experience with the Information Governance Catalogue (integrated data dictionary, stewardship, lineage, rules and policies, end-to-end information blueprints, metadata asset management)
  • Expertise in relational database modeling and dimensional modeling - Star schema and Snowflake schema - with hands-on experience in the data modeling tools Erwin and Microsoft Visio
  • Hands-on experience with IBM BigIntegrate on Hadoop, using Spark, Hive, and Impala connector stages to integrate big data with DataStage.
  • Thorough understanding of RDBMSs like Oracle, DB2, and SQL Server; extensively worked on data integration using DataStage for the extraction, transformation, and loading of data from various source systems such as Mainframes, SAS, Direct, FICO, Confidential IQ, and MDBS.
  • Experience in data warehousing concepts and ETL programming using DataStage Parallel Extender: analysis, cleansing, transformation, debugging/testing, and data loading across source systems. Optimized DataStage jobs using the partitioning and pipelining parallelism features of Parallel Extender
  • Experience migrating ETL code from version 8.5 to 11.5, from architecture design through promoting jobs to TEST, PROD, and DR environments (jobs versioned with the ClearCase version control tool and batch scripts), troubleshooting all technical issues related to the migration, and documenting all scenarios.
  • Experience with the Data Vault (DV 2.0) methodology: created hubs, links, and satellites and was involved in the design specific to the Confidential billing systems for all FIs and merchants.
  • Experience executing commands using the ISA Lite tool to gather trace files and log files, validate installation requirements, and run system health checks.
  • Experience in gathering requirements, preparing functional/technical specifications documents and interacting with the users.
  • Experience working on Teradata and its utilities such as FastLoad, MultiLoad, BTEQ, and TPT, and with management of data blocks, AMPs, the parsing engine and nodes, hashing with unique and non-unique primary indexes, distribution, tactical queries, partitioning, etc.
  • Has worked extensively with SQL & PL/SQL, including performing query optimizations.
  • Experience in writing UNIX scripts and DataStage BASIC routines.
  • Experience in developing complex Jobs, Sequences and re-usable Parameter sets/jobs/containers.
  • Expert knowledge of performance tuning source, mapping, target and sessions.
  • Excellent in Documentation, Knowledge Transfer and Production Support.
  • As part of production support, monitored jobs, debugged errors in batch jobs, and applied the required workaround or permanent fix to ensure continuity without disruption within the predefined SLAs. Performed administrative activities such as maintaining job logs, deploying jobs into PROD and DR environments, setting environment variables, stopping/starting the servers, engine patching, assigning DataStage user roles using the Web Console and Administrator, creating and configuring projects, handling trace files and the configuration file, unlocking ETL jobs (using commands), etc.
  • Worked as Confidential onsite coordinator with responsibilities such as providing solutions for issues/technical difficulties, assisting a 6-member offshore team as needed, and delivering knowledge transfer on design documents, source-to-target mappings, and business flow, documented as run books
  • Excellent knowledge of cleaning up bad/duplicate data in the FACT and ODS layers during production support
  • Experienced in 24x7 on-call rotation to support the production issues.
  • Experience scheduling DataStage jobs using IBM Tivoli Workload Scheduler, Control-M, and AutoSys
  • Extensive knowledge of ITIL V3 practices and experience with ServiceNow modules such as Incident Management, Change Management, Knowledge Management, CMDB, and Service Catalogue
  • Extensive knowledge of reporting tool administration, report creation (scheduled/ad hoc) for business users, migrating reports from lower to higher versions, and debugging report issues.
  • Good functional knowledge of the project landscape and all interfacing systems; interact with end users to identify information needs and business requirements.
  • Excellent communication and interpersonal skills and ability to work in a team as well as individually.
  • Innovative in approach, quick Learner and Strict adherence to the client Project guidelines.
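The job monitoring and unlock activities described above are typically driven from the engine tier with the `dsjob` command-line client. The sketch below builds the usual check/stop/reset command sequence in dry-run form (printed rather than executed, so it is environment-safe); the project and job names are hypothetical examples, not taken from any real environment.

```shell
#!/bin/sh
# Dry-run sketch of dsjob commands for checking, stopping, and resetting a
# hung batch job. Project/job names below are hypothetical placeholders.
DSHOME=${DSHOME:-/opt/IBM/InformationServer/Server/DSEngine}
PROJECT=EDW_PROD          # hypothetical project
JOB=seq_load_payments     # hypothetical job sequence

build_cmd() {
    # Print the command instead of executing it
    echo "$DSHOME/bin/dsjob $*"
}

build_cmd -jobinfo "$PROJECT" "$JOB"            # current status of the job
build_cmd -logsum -type FATAL "$PROJECT" "$JOB" # summarize fatal log entries
build_cmd -stop "$PROJECT" "$JOB"               # stop a hung invocation
build_cmd -run -mode RESET "$PROJECT" "$JOB"    # reset so it can be rerun
```

In a real support rotation these commands would be wrapped with SLA-aware logging; the dry-run form here only shows the ordering.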

TECHNICAL SKILLS:

ETL: IBM Information Server 11.5/9.1/8.5 (Designer, Director, Administrator, Web Console, Operations Console, Governance Catalogue), Hadoop, Teradata, SSIS, Informatica PowerCenter 9, IBM WebSphere Cast Iron V6.3.0

Data Modeling: Star Schema, Snowflake Schema, Dimensions & Fact Tables, Physical and Logical Data Modeling, Erwin

BI Tools: SAP Business Objects XI 3.1 and 4.0, IBM Cognos, SSRS and Tableau

Databases: Oracle 10g/11g, DB2 9.x, MS SQL Server 2005/2008 R2, MySQL

Environment & OS: Windows 2003/2000/NT 4.0, Windows 7 (NT 6.1), Sun Solaris 2.x/5.x, IBM AIX 5.1/4.3, UNIX and Linux

Languages: C++, Java, XML, SQL, T-SQL, PL/SQL, JavaScript, UNIX Shell Scripting

PROFESSIONAL EXPERIENCE:

Confidential, Cincinnati, OH

Sr. Datastage Developer

Responsibilities:

  • Hands-on experience with the Information Governance Catalogue (integrated data dictionary, stewardship, lineage, rules and policies, end-to-end information blueprints, metadata asset management)
  • Worked extensively on modeling and designing data structures to simplify and accelerate data and integration design for business intelligence, master data management (MDM), and service-oriented architecture initiatives
  • Simplified and sped up warehouse design, dimensional modeling, and change management by providing warehouse data modelers and database administrators a tool to design and manage the warehouse from an enterprise logical model
  • Increased efficiency and reduced time to market through greater understanding of current data assets.
  • Leveraged data-model value across the data lifecycle to promote cross-lifecycle, cross-role, and cross-organization collaboration and to align process, service, application, and data architecture
  • Collaborated across multiple lines of business to build information policies and governance rules supporting regulatory requirements.
  • Built rules, domains, and identifiers; identified terms and assigned them to a class provided by IBM or created a custom class based on the domain.
  • Checked integrity of data usage and performed data quality assessments.
  • Assessed the cost of poor data quality and managed data quality issues to closure
  • Engaged subject-matter experts across the data domain teams through business processes to review and approve corporate glossary changes
  • Assisted business users in playing an active role in information-centric projects and in collaborating with the IT team across the enterprise.
  • Created well-documented, end-to-end information blueprints to ensure business requirements are aligned with enterprise standards

Datastage Administrator

Confidential

Responsibilities:

  • Gathering the hardware/software requirements for project creation and working with the infrastructure team to allocate the resources needed for the migration project, such as CPU, processors, and memory
  • Collecting the artifacts needed and working with an IBM representative to build a model aligned with enterprise standards.
  • Building a clustered architecture for the DEV, TEST, PROD, and DR environments and configuring it as active/active (PROD/DR) or active/passive (TEST/DEV)
  • Developing batch and backup scripts to migrate all the .dsx job exports from version 8.5 to 11.5, and setting up the NLS settings across the project, parameter sets, environment variables, configuration files, etc.
  • Integrating Datastage with Big Data Hadoop and validating all the connectors.
  • Ensuring the appropriate keystore certificates are installed and renewed as needed.
  • Validating the current scripts set up on the server for service recycles (DS Engine, Node Agent, WAS) and rewriting them to meet the requirements of 11.5
  • Applying Salesforce and JDK patches to the DataStage Information Server to make it scalable and integrated with the cloud
  • Validating the connections for all the APIs provided in the tool and ensuring end-to-end connectivity is validated for all the stages
  • Replacing the old connector and JDBC stages with the Oracle Connector stage through the IBM data modeler tool available in version 11.5
  • Working with the IBM Operations Console to monitor job performance, workload management, and CPU/RAM usage
  • Worked with the ISA Lite tool, collecting the artifacts IBM needs for ticket creation and working with IBM to drive tickets to closure (e.g., the Checksum stage issue from 8.5 to 11.5, replaced with Hash/MDM stages)
  • Good experience in developing batch scripts as needed for the project
  • Identifying business needs, evaluating business and technical alternatives, recommending solutions, and participating in their implementation
  • Developing ETL design, development, and implementation standards and procedures based on industry best practices, and recommending software upgrades for existing programs and systems
  • Created DataStage parallel jobs using different stages like Transformer, Aggregator, Sort, Join, Merge, Lookup, Data Set, Funnel, Remove Duplicates, Copy, Modify, Filter, Change Data Capture, Change Apply, Sample, Surrogate Key, Column Generator, Row Generator, etc.
  • Extensively worked with Join, Look up (Normal and Sparse) and Merge stages.
  • Extensively worked with sequential file, dataset, file set and look up file set stages.
  • Created Error files and Log Tables containing data with discrepancies to analyze and re-process data.
  • Migrated jobs from the development instance to testing environment.
  • Performed Unit and Integration testing and validated the test cases by comparing the actual results with expected results.
  • Worked on performance tuning and enhancement of Data Stage job transformations.
  • Resolved data issues during the testing processes and reloaded the data with all the necessary modifications.
  • Extensively used parallel stages like Row Generator, Column Generator, Head, and Peek for development and debugging purposes.
  • Developing DataStage jobs to perform ETL transformations per the requirements provided and load the respective dimension and fact tables.
  • Extensive usage of AQT and SQL Developer for analyzing data and writing SQL and PL/SQL scripts performing DDL operations.
  • Extensively used the slowly changing dimension Type 2 approach to maintain history in the database.
  • Extensively worked on Data Vault 2.0 concepts (hub, link, and satellite) for data modeling
  • Participated in Daily Stand up meetings as per agile process and updating the user story status to the Scrum master and product owner.
  • Worked on analyzing the data generated by the business process, defining the granularity, source to target mapping of the data elements, creating Indexes and Aggregate tables for the data warehouse design and development.
  • Worked on Metadata Definitions, Import and Export of Datastage jobs using Data stage Designer.
  • Worked on complex data coming from Mainframes (EBCDIC files), setting the appropriate NLS and record layout settings to read the data.
  • Worked on studying the data dependencies using metadata stored in the repository and prepared batches for the existing sessions to facilitate scheduling of multiple sessions.
  • Managing and upgrading existing applications and/or integrating application with any new/existing applications and databases
  • Working with Datastage designer and Director to load data from source to target databases according to the requirements provided
  • Working in all phases of the software development lifecycle, covering data cleansing, performance tuning, unit testing, system testing, and user acceptance testing. Designing applications or systems and creating models and diagrams to show the software code needed for an application.
  • Develop source to target mappings and specifications for data management scripts.
  • Analyze user’s needs and then design, test and develop software to meet those needs.
  • Developed scripts in PL/SQL and wrote UNIX scripts for testing, for loading data into tables, and for migrating files between environments.
  • Working on the complete Software Development Life Cycle (SDLC) to meet business requirements.
  • Working with a tracker to track the different job issues that need to be resolved, the new jobs that need to be created, and production support issues.
  • Worked with IBM ClearCase to check code in and out, and built batch scripts to reference the views created per project and migrate the code to the corresponding project applications (EDW/RMB/MERCURY/PDW/FDP/DNA/HRIT)
  • Preparing detailed workflow charts and data modeling diagrams that describe input, output, and logical operations, and converting them into a series of instructions coded in a computer language.
  • Writing, updating and maintaining computer programs or software packages to handle specific jobs such as tracking inventory, storing and retrieving data or controlling other equipment.
  • Involved in cleanup of data in the Fact and Operational Data Store layers as part of data retention and performance tuning, with the help of stored procedures and DBAs.
  • Worked on collecting the artifacts of jobs that produced multiple error records in the error log table, analyzing the root cause, and fixing them accordingly.
  • Knowledge of using Hive and Impala connector stages in DataStage v11.5
  • Installed patch sets and upgraded Teradata.
  • Automated data warehousing and data mart refreshes using Transportable table space option.
  • Worked on creating and managing the partitions.
  • Performed database health checks and tuned the databases using Teradata Manager.
  • Delivering new and complex high-quality solutions to clients in response to varying business requirements, and creating and managing user accounts.
  • Installed Teradata drivers for the Teradata utilities.
  • Refreshed the data by using the FastExport and FastLoad utilities.
  • Used the Teradata Administrator and Teradata Manager tools to monitor and control the system.
  • Creating and modifying MultiLoad scripts for Informatica using UNIX and loading data into IDW.
  • Loading data from various data sources and legacy systems into the Teradata production and development warehouses using BTEQ, FastExport, MultiLoad, FastLoad, and Informatica.
  • With in-depth expertise in the Teradata cost-based query optimizer, identified potential query bottlenecks in terms of query writing, skewed redistributions, join order, optimizer statistics, and physical design considerations (PI, USI, NUSI, JI, etc.). In-depth knowledge of Teradata Explain and Visual Explain to analyze and improve query performance.
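The FastLoad/MultiLoad refreshes above follow a standard control-script shape. As a minimal sketch, the script below writes an illustrative FastLoad control file rather than submitting it; the TDPID, credentials, database, table, and file names are all hypothetical placeholders, not from any real environment.

```shell
#!/bin/sh
# Generates an illustrative Teradata FastLoad control script. All object
# names, the TDPID, and the credentials are hypothetical placeholders.
# In practice this file would be submitted with:
#   fastload < load_stg_payments.fl
cat > load_stg_payments.fl <<'EOF'
LOGON tdprod/etl_user,xxxxxxxx;
DROP TABLE stg_db.stg_payments_err1;
DROP TABLE stg_db.stg_payments_err2;
BEGIN LOADING stg_db.stg_payments
      ERRORFILES stg_db.stg_payments_err1, stg_db.stg_payments_err2;
SET RECORD VARTEXT "|";
DEFINE payment_id (VARCHAR(18)),
       payment_dt (VARCHAR(10)),
       amount     (VARCHAR(15))
       FILE = /data/inbound/payments.dat;
INSERT INTO stg_db.stg_payments
VALUES (:payment_id, :payment_dt, :amount);
END LOADING;
LOGOFF;
EOF
echo "generated load_stg_payments.fl"
```

Note that FastLoad requires an empty target table, which is why it suits staging-table refreshes; history-preserving loads would go through MultiLoad or TPT instead.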

Environment: IBM Infosphere Information Server 8.5/11.5 (Data Stage, Quality Stage), Oracle 11g, Oracle 10g, DB2, AQT, SQL Server 2008, Oracle 11g R2, Aix UNIX, Putty, WinSCP (FTP Tool), Linux, Windows NT, Rally, Clear Case, TWS Scheduler, SAP Business Intelligence

Confidential, Cincinnati, OH

Datastage Production Support Analyst

Responsibilities:

  • Monitoring the daily batch and troubleshooting the issues.
  • Providing quick response to the clients for the problems they face in production environment.
  • Involved in testing, logging defects, discussing them with the developers on daily calls, and tracking them until completion.
  • Understanding technical challenges and providing efficient solutions
  • Coordinating between the onsite and offshore teams Confidential Bangalore.
  • Peer-to-peer knowledge sharing.
  • Interacting with different interfacing systems and preparing the mapping documents related to batch flow and interdependencies between application jobs.
  • Designing workflow diagrams and the project design document.
  • Development (strictly following code standards and specifications).
  • Guiding team members in support work and other challenges.
  • Reviewing code to ensure accuracy, completeness, and quality of the deliverable
  • Quality improvement processes like conducting requirement and design walkthroughs and reviews to ensure the quality of the deliverables.
  • Reporting progress and listing scheduled tasks on a day-to-day basis to both the client managers and Confidential managers.
  • Actively participating in Process improvement techniques, gathering the requirements and driving it to closure.
  • Enhancement of the present functionality if required in alignment with standards
  • Provide post production support and user documentation.
  • Participation in Disaster Recovery (DR) exercises to ensure systems are Business Continuity Plan (BCP) compliant.
  • Providing reports, in the form of metrics, for the analysis and research work carried out
  • Converting recurring production failures into problem tickets with detailed analysis, root cause, and the corrective action required, and driving them to closure in coordination with Development.
  • Validating and participating in production change walkthroughs for any changes moving to production, whether fulfilling business requirements or enhancing system performance, to ensure changes are moved to the system with minimal impact to the services.
  • Participating in the Bridge/Incident calls for critical issues, preparing root cause Analysis documents, and updating the same in SharePoint.
  • Scheduling the jobs through IBM Tivoli Workload Scheduler with appropriate dependencies.
  • Participating in Knowledge Management to document the steps carried out to restore the service or resolve an issue and upload them into IRIS (Service Management Repository tool) so the articles can be reused.
  • Worked on server patching, which involves updating patches (Salesforce, JDK, WebSphere key SSL certificates) and recycling application services (Node Agent, DS Engine, JobMonApp, Resource Tracker) pre- and post-patching.
  • End-user validation for all patching activities, which involves ensuring successful restoration of the application and validating the client tools: Designer, Director, Administrator, and Web Console
  • Deploying the code into the corresponding projects through a batch script as part of a production release, in alignment with a Change Management process covering Comprehensive, Infrastructure, and Standard changes.
  • Ensuring the vulnerabilities for the server and WAS/HIS are remediated in coordination with the development, architect, and admin teams.
  • Nominated by the client manager as the team's ITIL and Knowledge Management Champion, which involves ensuring the team follows the Service Management life cycle standards and publishing and reviewing the knowledge articles' content and quality.
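The pre/post-patching service recycle mentioned above follows a fixed stop/start ordering. The sketch below prints a typical sequence in dry-run form; the install path is the common default location (an assumption to verify per environment), and the exact scripts can vary by Information Server release.

```shell
#!/bin/sh
# Dry-run sketch of a pre/post-patching recycle for Information Server
# services. IS_HOME assumes a default install location (an assumption).
IS_HOME=${IS_HOME:-/opt/IBM/InformationServer}

step() { echo "WOULD RUN: $*"; }   # print each command instead of executing

# Stop: node agents and engine first, application server services last
step "$IS_HOME/ASBNode/bin/NodeAgents.sh stop"
step "$IS_HOME/Server/DSEngine/bin/uv -admin -stop"
step "$IS_HOME/ASBServer/bin/MetadataServer.sh stop"

# ... apply patches here (JDK, SSL certificates, etc., per release notes) ...

# Start in reverse order, then validate with the client tools
step "$IS_HOME/ASBServer/bin/MetadataServer.sh run"
step "$IS_HOME/Server/DSEngine/bin/uv -admin -start"
step "$IS_HOME/ASBNode/bin/NodeAgents.sh start"
```

After restart, end-user validation (Designer/Director/Administrator/Web Console logins) confirms the services came back cleanly.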

Environment: IBM Infosphere Information Server 8.5/11.5 (Data Stage, Quality Stage), Oracle 11g, Oracle 10g, DB2, AQT, SQL Server 2008, Oracle 11g R2, Aix UNIX, Putty, WinSCP (FTP Tool), Linux, Windows NT, Rally, Clear Case, TWS Scheduler, SAP Business Intelligence

Confidential

Developer

Responsibilities:

  • Developed Java programs based on the Scenario provided using Eclipse
  • Introduction to RDBMS and SQL/PL-SQL commands.
  • Executed the queries asked during diagnostics and produced the results using SQL and PL/SQL.
  • End-to-end knowledge guidance from the trainer on DW fundamentals, dimensional modeling, normalization forms, slowly changing dimensions, and data modeling.
  • Presented DW Fundamental Topics to other stream technologies as part of ILP Program
  • Introduction to the ETL tool Informatica PowerCenter, its installation, features, and components
  • Developed workflows, mapplets, and scheduler tasks around key performance indicators using various data cleansing and transformation rules
  • Worked on Bugzilla Testing tool from Mozilla Foundation for ETL testing.
  • Introductory Session from Trainer on BI Reporting tools such as SAP Business Objects
  • As part of TRAI project team, identified the Key Performance Indicators and created Business Object Universe to cater specific needs as part of TRAI’s mission.
  • Created reports using OLAP techniques like Slice & Dice and Drill Down features, and designed cross-tab and master-detail reports and dashboards.
  • Involved in the Runtime Improvements and tuning and optimization of the critical queries of the application.

Environment: SQL, PL/SQL, ITIL, UNIX, INFORMATICA, and SAP Business Objects.

Confidential

SQL Developer

Responsibilities:

  • System study, interaction with users and management, analysis, design, coding, testing, and implementation of the system.
  • Extensively involved in coding PL/SQL packages, procedures, functions, and database objects according to business rules.
  • Designed and implemented a number of data entry forms using Developer/2000, along with various management reports like the cash flow statement, cash/bank statement, account statement, balance sheet, and sales/purchase register.
  • Designed and prepared specifications and performed module-level testing of over 45 forms (Forms 4.5), 40 reports (Reports 2.5), and 35 stored procedures and database triggers for the company's reconciliation.

Environment: Oracle 7.3/8i, SQL, PL/SQL, Developer/2000(Forms 4.5, Reports 2.5) and Sun Solaris/Windows-NT.
