Pega Robotics Developer Resume
Eagan, MN
SUMMARY
- Over 13 Years of experience in System Analysis, Design, Development and Quality Analysis, including over 5 Years in Data Warehouse and Data Mart applications using IBM Information Server 8.x/9.x DataStage, Oracle, PL/SQL, SQL Server, DB2, TOAD, Erwin, Visio and Shell Scripts.
- Extensive experience in Data Warehousing tools: IBM Information Server 9.x/8.x (Designer, Director and Administrator), Ascential DataStage 7.5.2/7.5.1 (Manager, Designer, Director and Administrator) and Data Modeling tools.
- Extensive experience with Oracle, DB2, MS SQL Server databases.
- Extensive experience working with Oracle 8i/9i/10g including PL/SQL programming.
- Experience with Datastage Administration activities.
- Data modeling knowledge using Dimensional Data Modeling, Star Schema Modeling, Snow Flake Modeling, Fact and Dimension Tables
- Extracted data from heterogeneous sources like SQL Server, Oracle and DB2.
- Extensive experience in developing strategies for Extraction, Transformation, Loading (ETL) data from various sources into Data Warehouse and Data Marts using Ascential Datastage (Designer, Director and Manager).
- Experience in Shell Scripting (Korn Shell).
- Experience in BI Reporting tool (Business Objects).
- Experienced in using highly scalable parallel processing infrastructure with DataStage Parallel Extender (DS-PX).
- Experience in integration of various data sources such as Oracle, SQL Server and DB2.
- Ability to work with clients to identify source-target data and its availability; strong data extraction, processing and testing skills; proficient with multiple extraction, transformation and load tools.
- Experience with Conceptual, Logical and Physical data modeling.
TECHNICAL SKILLS
ETL: IBM Ascential DataStage 9.x/8.x (Administrator, Designer, Director, Manager, Parallel Extender), IBM WebSphere/InfoSphere Information Server 8.x/9.x
Data Modeling: Dimensional, Logical & Physical Data Modeling, Star Schema, Snowflake Schema, ER-Win 7.x
RDBMS: Oracle 11g/10g, DB2 10.x/9.x, SQL Server 2010
Programming: PL/SQL, K-Shell Scripting, VB Script, C, SAS, R, Python
Reporting: Business Objects, Cognos Reports 9.x/8.x
Operating System: Unix, Linux, Windows NT/XP/7 server
Scheduling Tools: Control-M, Autosys, Crontab Utility, IBM Tivoli, Star Team, Robot Scheduler
PROFESSIONAL EXPERIENCE
Confidential, Eagan, MN
PEGA Robotics developer
Responsibilities:
- Created OpenSpan automations for downloading data from an application into Excel spreadsheets.
- Designed, developed and tested bot tasks using OpenSpan.
- Experienced in interrogating Web, Windows and Text Adapters.
- Implemented error-handling mechanisms and used diagnostic tools for debugging.
- Experienced in working with Office Connectors and the PDF Connector.
- Experienced in runtime automation diagnosis and troubleshooting activities.
- Implemented interrogation mechanisms with web and Windows applications.
- Involved in upgrading OpenSpan from version 7.1 to 8.0.
- Involved in automating the download of a file in Internet Explorer using Pega Robotic Studio 8.
- Involved in downloading an attachment from a web URL through the Windows Adapter.
Environment: OpenSpan 7.1, ASP.NET, C#, Visual Studio .NET 2008, MVC 4.0, T-SQL, XML, MVVM, SQL Server 2008, Oracle 9i, IIS, PowerShell, Ext JS.
Confidential, Chicago, IL
Data Analyst / ETL Developer
Responsibilities:
- Involved in implementing a star schema for the data warehouse using Erwin for Logical/Physical and Dimensional Data Modeling.
- Developed various jobs using Aggregator, Sequential File, Transformer and Change Capture stages.
- Developed Parallel jobs using Stages, which includes Join, Transformer, Sort, Merge, Filter and Lookup.
- Used Shared Containers and created reusable components for local and shared use in the ETL process.
- Analyzed data discrepancies through error files and log files for further data processing and cleansing.
- Used the DataStage Director runtime engine to schedule and execute developed jobs and job sequences, and used log events to monitor job progress and performance.
- Designed and developed ETL processes using DataStage Designer to load data from Oracle, MS SQL, flat files (fixed width) and XML files into the staging database, and from staging into the target Data Warehouse database.
- Participated in weekly status meetings, conducted internal and external reviews and formal walkthroughs among various teams, and documented the proceedings.
- Involved in mapping team discussions to solve the issues in mapping document.
- Worked on troubleshooting, performance tuning and performance monitoring for enhancement of DataStage jobs.
- Participated in walkthroughs of business requirements and design specifications for the projects.
- Involved in detailed understanding of Star Schema and Snowflake architecture.
- Involved in extensive testing of Type I and Type II Dimension tables and Fact tables.
Environment: IBM InfoSphere Information Server DataStage 9.x/11.x, HP ALM, SQL Server, SQL, UNIX, Windows XP, Oracle 10g/11g, Agile.
Confidential, Chicago, IL
Data Analyst
Responsibilities:
- Learned and understood IC Plus and Pharmacy Renewal systems.
- Worked in the agile development environment with frequently changing requirements.
- Automated the ETL testing process using Talend tool to validate source and target systems as per test cases.
- Validated Mapping document between source and target systems.
- Validated record count between source and target systems using Talend ETL tool.
- Performed Null and Duplicate Validations using Talend ETL tool.
- Validated Incremental loading using Talend ETL tool.
- Performed date validation (for example, From Date should not be greater than To Date, and date values should not contain junk or null values).
- Actively participated in Scrum meetings, reviews and test coordination activities.
- Worked closely with the development and support teams to resolve any client/consultant/system issues.
- Participated in design walkthroughs and reviewed test scripts to ensure that they met the business objectives.
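The date checks described above ran inside the Talend ETL tool; as a rough illustration, the same rules can be sketched as a standalone shell function. The function name and the YYYY-MM-DD format are illustrative assumptions, not part of the original validation suite.

```shell
# Hypothetical sketch of the date validation rules above; the real checks
# ran inside Talend. Function name and YYYY-MM-DD format are assumptions.

# Succeeds only when both arguments are well-formed YYYY-MM-DD dates
# and FROM_DATE is not later than TO_DATE.
valid_date_range() {
  from="$1"; to="$2"
  case "$from" in
    [0-9][0-9][0-9][0-9]-[0-9][0-9]-[0-9][0-9]) ;;
    *) return 1 ;;   # junk, empty/null, or wrong format
  esac
  case "$to" in
    [0-9][0-9][0-9][0-9]-[0-9][0-9]-[0-9][0-9]) ;;
    *) return 1 ;;
  esac
  # Strip the dashes and compare numerically: 2020-01-31 -> 20200131
  f=$(printf '%s' "$from" | tr -d '-')
  t=$(printf '%s' "$to" | tr -d '-')
  [ "$f" -le "$t" ]
}
```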
Confidential
SAP QA Analyst
Responsibilities:
- Extensively analyzed/tested various Confidential application modules (Agreements, Awards, Billbacks, Chargebacks, Deals, Sales Rebates, & Incentives, Royalties) to validate and verify their various functionalities.
- Participated in Business Requirement walkthroughs along with Business Analysts and development team to better understand the new features of application modules.
- Prepared test scenarios and test cases per functional and business requirements and knowledge transfers given by developers.
- Successfully managed integration testing of Confidential applications with SAP HANA system.
- Learned and understood the application products developed by the business KTs before writing any new test cases for each new release.
- Worked closely with the development and support teams to resolve any client/consultant/system issues.
- Participated in design walkthroughs and reviewed test scripts to ensure that they met the business objectives.
- Identified functionality and performance issues, including deadlock conditions, database connectivity problems and system crashes under load.
- Involved in developing end to end test scenarios for various application modules.
- Involved in Premium Qualification testing and User Acceptance testing as well.
- Interacted with offshore team to resolve defects and functional issues of Confidential application modules.
- Responsible for reviewing functional spec and design documents, test case writing, test execution, reporting and resolving issues, follow-up, writing release notes and preparing documentation for knowledge transfer.
- Actively participated in the analysis, testing, deployment & production support phases of SAP Confidential release and support packs.
- Analyzed business requirements and processes in order to create test plan, configure the end to end business scenarios to test and minimize client issues.
- Managed Regression testing, Performance testing, User Acceptance (UAT) testing and Security testing to comply with SAP America's standard in each testing cycle to ensure the highest quality product.
- Worked with ALM/Quality Center to trace the requirements, write the Test Cases in the test plan, execute the Test Sets in the test lab and track the Defects logged in the defects module.
- Generated daily test executions, defects tracking and RTM status report using tools like HP ALM/Quality Center.
- Designed test scenarios based on the use cases in HP Quality Center.
- Generated requirements traceability matrix (RTM) using HP Quality Center.
- Developed test plan for Order Management Service using HP Quality Center.
Data Analyst/ ETL Developer
Responsibilities:
- Created DataStage jobs to purge daily snapshot records older than 45 days.
- Involved in a validation framework for job reconciliation to meet quality checks such as record counts and data comparisons between source and target data.
- Modified existing DataStage jobs to make them reusable for other business functions, such as publishing data into target tables.
- Worked with DBA team to refresh the DEV tables from PROD if necessary as part of Unit testing.
- Used DataStage Director to monitor, analyze and debug Datastage jobs.
- Used various Parallel Stages (Lookup, CDC, Merge, Join, Modify, Funnel, Filter, Switch, Aggregator, Remove Duplicates, Transformer and Peek) extensively.
- Unit tested DataStage Jobs in development including creating the appropriate test data.
- Used Control M scheduler to automate the execution of all ETL Jobs.
- Used DataStage Director for job scheduling, monitoring performance and troubleshooting from log files.
- Worked with DataStage Administrator to define project properties.
- Designed Job sequences to control and execute the Datastage jobs using various Activity stages like Job Activity, User variable, Notification Activity, Exception Handler etc.
- Developed documents like Source to Target mapping for developing the ETL jobs.
- Designed and implemented Slowly Changing Dimension (SCD) methodologies.
- Responsible for monitoring all running, scheduled, completed and failed jobs; troubleshooting failed jobs was the top priority in these situations.
- Involved in the deployment of DataStage jobs from Development to QA environment.
- Effectively and efficiently interacted with the client in gathering business requirements for the ETL module.
- Gathered metadata definitions of the source systems and prepared transformation rules according to business requirements for new enhancements.
- Actively involved in everyday job monitoring and resolved all issues related to job aborts and failures.
- Redesigned few jobs in Datastage Designer to meet the changes in new incoming feeds.
- Involved in importing and exporting jobs category-wise and maintaining backups regularly.
- Coordinated with the offshore team from onshore.
- Worked closely with the Data analyst and business analyst during the design and development of ETL technical specification document.
- Improved job performance to meet strict timelines.
- Worked with SQuirreL to interact with Oracle and used the tool for testing.
- Used the Change Data Capture (CDC) technique to capture inserts, deletes and updates and applied the changes to the target database. Created reusable components such as parameter sets.
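The count and data-comparison QCs described above ran inside DataStage jobs; a minimal shell sketch of the same idea, assuming source and target extracts are unloaded to sorted flat files (the file paths and function names are illustrative):

```shell
# Hypothetical sketch of the source/target reconciliation QC described
# above; the production version ran inside DataStage. Paths and names
# are illustrative.

# Compare record counts of two extract files; print a status line and
# succeed only when the counts match.
recon_counts() {
  src="$1"; tgt="$2"
  src_cnt=$(wc -l < "$src")
  tgt_cnt=$(wc -l < "$tgt")
  if [ "$src_cnt" -eq "$tgt_cnt" ]; then
    echo "QC PASS: $src_cnt rows in both source and target"
  else
    echo "QC FAIL: source=$src_cnt target=$tgt_cnt"
    return 1
  fi
}

# Data comparison: rows present in source but missing from target
# (both inputs must already be sorted for comm to work).
missing_in_target() {
  comm -23 "$1" "$2"
}
```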
Environment: InfoSphere Information Server DataStage 9.1/8.7/8.5, DB2 UDB 9.0, WinSCP 4.3.7, PuTTY 0.62, K-Shell Scripts, SQuirreL Client, Control-M, StarTeam, HP-UX, Windows 7.
Confidential, Northbrook, IL
Data Analyst/ ETL Developer
Responsibilities:
- Created ETL Process flows, Design Specs Docs, Unit Test Cases for DataStage ETL jobs.
- Worked with Third Party Vendors to receive the appropriate file formats of Source Data Files.
- Worked with Business Analysts and Subject-Matter Experts to fill up the GRAY AREA in the Business Requirements before implementing the comprehensive ETL solutions.
- Worked with DBA team to refresh the DEV tables from PROD if necessary as part of Unit testing.
- Created DDL queries by analyzing the source data file and provided them to the DBA team for creating new tables in the DEV environment.
- Created migration documents for all iMART project enhancements (Dev to QA and QA to PROD).
- Worked with the DataStage Admin to migrate all required ETL jobs, SQL scripts and shell scripts to the respective environments (Dev to QA / QA to PROD).
- Involved in installing DataStage 8.1/8.5 software, patches and fix packs, and worked on both versions of DataStage.
- Created K-Shell scripts to read data from Excel and CSV files.
- Modified the K-Shell scripts to get the source files from the FTP server into the local UNIX server.
- Modified the DataStage job to load the source file data based on data month rather than by system date.
- Used DataStage Director to monitor, analyze and debug DataStage jobs.
- Used various Parallel Stages (Lookup, CDC, Merge, Join, Modify, Funnel, Filter, Switch, Aggregator, Remove Duplicates, Transformer and Peek) extensively.
- Generated XML files by using the XML output stage as a part of Delta Job process.
- Created job schedules in ESP to automate the ETL process.
- Involved in performance tuning during historical and daily loads, reducing the batch window and making the application more robust.
- Unit tested DataStage Jobs in development including creating the appropriate test data.
- Used the Active Batch scheduler to automate the execution of all DataStage ETL jobs.
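The K-Shell FTP pull described above can be sketched as below. The host, directories and file names are placeholders, and the real script also handled credentials and file archiving; building the ftp command batch in a function keeps the generated script easy to inspect.

```shell
# Hypothetical sketch of the K-Shell FTP pull described above.
# Host, directories and file names are placeholders.

FTP_HOST="ftp.example.com"
SRC_DIR="/outbound/imart"
LOCAL_DIR="/data/imart/incoming"

# Emit the ftp command batch for one source file.
build_ftp_batch() {
  file="$1"
  printf 'cd %s\n'  "$SRC_DIR"
  printf 'lcd %s\n' "$LOCAL_DIR"
  printf 'binary\n'
  printf 'get %s\n' "$file"
  printf 'bye\n'
}

# In the real job the batch is piped into the ftp client, e.g.:
#   build_ftp_batch claims_202001.csv | ftp -inv "$FTP_HOST"
```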
Environment: InfoSphere Information Server DataStage 8.7/8.5, Oracle SQL Developer 3.0, Oracle 11g, Aginity Workbench for Netezza 3.0, Netezza 6.0.5, WinSCP 4.3.7, PuTTY 0.62, K-Shell Scripts, Active Batch, HP-UX, Windows 7.
Confidential, New York, NY
Data Analyst/ ETL Developer
Responsibilities:
- Worked with architects and subject-matter experts to design comprehensive ETL solutions.
- Involved in Defining Best ETL practice doc, Development standards doc for DataStage ETL Jobs.
- Created ETL process flows, Design spec documents, Unit test cases for DataStage ETL jobs.
- Used various Parallel Stages (Lookup, CDC, Merge, Join, Modify, Funnel, Filter, Switch, Aggregator, Remove Duplicates, Transformer, Row Generator and Column Generator) extensively.
- Monitored jobs for performance issues and troubleshooting.
- Provided performance-tuning insight to project teams.
- Involved in creating AutoSys schedules for job dependencies and timings.
- Used DataStage Director to monitor, analyze and debug DataStage jobs.
- Unit tested DataStage Jobs in development including creating the appropriate test data.
- Designed DataStage sequences to specify Job execution order.
- Created UNIX Shell Scripts as wrappers to automate the process of running DataStage jobs and to track DataStage Job logs and Script logs.
- Implemented surrogate keys using Key Management functionality for newly inserted rows in the Data Warehouse, making data availability more convenient.
- Implemented Slowly Changing Dimensions (Type 1 and Type 2) using DataStage ETL jobs.
- Developed job sequences to execute a set of jobs with restartability and checkpoints, and implemented proper failure actions.
- Used DataStage Designer to create the table definitions for the CSV and flat files, import the table definitions into the repository, import and export the projects, release and package the jobs.
- Experienced in using advanced DataStage real-time stages such as Web Services and XML, and used Regroup, Parser, HJoin and Sort steps in XML.
- Redesigned many existing ETL jobs (SQL scripts, Shell Scripts) in DataStage as per the Framework approach.
- Provided support to the QA team to run all respective objects in the DEV and UAT environments during scheduled cutovers at various phases of the project.
- Responsible for performance tuning of DataStage code.
- Created objects such as tables, views, materialized views, procedures and packages using Oracle tools such as PL/SQL and SQL*Plus.
- Created a shell script to run DataStage jobs from UNIX, then scheduled this script through a scheduling tool.
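The wrapper scripts above can be sketched around the `dsjob` command-line client. The exit-status mapping below (1 = finished OK, 2 = finished with warnings) follows common DataStage convention but varies by release, so it is an assumption to verify; the runner command is passed in as a parameter so the logic can be exercised without a DataStage engine.

```shell
# Hypothetical sketch of a wrapper around the dsjob client used to run
# and log DataStage jobs. The runner command is injected ($1) so a stub
# can stand in for dsjob; the exit-code mapping (1 = OK, 2 = warnings)
# is an assumption that should be verified per DataStage release.

run_ds_job() {
  runner="$1"; project="$2"; job="$3"; log="$4"
  echo "$(date '+%Y-%m-%d %H:%M:%S') starting $project/$job" >> "$log"
  "$runner" -run -jobstatus "$project" "$job"
  status=$?
  case "$status" in
    1|2) echo "$(date '+%Y-%m-%d %H:%M:%S') $job finished (status $status)" >> "$log"
         return 0 ;;
    *)   echo "$(date '+%Y-%m-%d %H:%M:%S') $job FAILED (status $status)" >> "$log"
         return 1 ;;
  esac
}
```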
Environment: Ascential DataStage 7.5/8.1 (IIS with Multi Client), TOAD, Oracle 10g/11g, SQL, PL/SQL, Shell Scripts, AutoSys, HP-UX, Windows XP.
Confidential
Data Analyst/ ETL Developer
Responsibilities:
- Developed DataStage jobs based on System Use case Documents, Source-Target mappings and Supplementary Specifications.
- Implemented the transformation, custom, compare and match rules for both general and crosswalk plans while extracting the data from BHI claims and BHI provider Master schemas.
- Created DataStage jobs for loading dimension and fact tables.
- Created a DataStage job with logic to ensure the same provider records are not loaded into the table again when the job is rerun with the same plan code.
- Took responsibility for deploying DataStage jobs, DDL scripts and SQL scripts to various environments such as Testing and Performance Engineering.
- Involved in DDL activities such as creating, altering and dropping tables as part of the deployment process.
- Administered DataStage for security hazards and denied access to unauthorized servers.
- Involved in writing the Deployment Guide and Release Notes for various releases of the project.
- Closely worked with Testing team to resolve any deployment issues and assigned defects.
- Performed dry runs of the DataStage jobs for the current release in the PE team using the Tivoli setup, in order to avoid deployment issues such as missing job sequences, Tivoli setup issues or missing reference data.
- Used Tivoli to schedule jobs and e-mailed the status of ETL jobs during smoke and regression testing.
- Used the Director client to validate, run, schedule and monitor the DataStage jobs run by the WebSphere DataStage server.
- Used DataStage Designer to develop parallel jobs to extract, cleanse, transform, integrate and load data into Data Warehouse.
- Administered DataStage projects, managed global settings and provided a command interface to the DataStage repository.
- Used DataStage Administrator to set environment variables, job monitoring limits, user privileges and job scheduling options.
- Used DataStage Director to schedule, monitor, analyze and debug DataStage jobs.
- Designed DataStage job streams to specify Job execution order using Tivoli setup.
- Tuned DataStage jobs to obtain better performance by using various tuning techniques.
- Unit tested DataStage Jobs in development including creating the appropriate test data.
- Exported and imported table definitions using DB2 plug-ins for various purposes.
- Worked on call for production support.
Environment: IBM InfoSphere DataStage 8.5 (latest version at the time), PL/SQL, IBM DB2 UDB 8.x, Erwin 7.x, Toad 9.x, IBM Cognos 8.x, ClearQuest, ClearCase, RequisitePro, Tivoli, Quality Center.
Confidential, Deerfield, IL
ETL Developer
Responsibilities:
- Developed DataStage job Installation, Checklist, Run Sheet and Unit Test Plan documents.
- Modified the existing DataStage jobs per Confidential's LDS (UNIX Lockdown Structure) standards.
- Implemented crossover claims functionality (to exclude claims that have a date of service that falls in a prior reporting calendar year).
- Implemented formulary look up functionality (The Evaluation Period will begin with the first day of the Look-Back period and end with the last day of the Current EOB Month).
- Prepared Detail Design documents on the formulary look-up and crossover claims functionalities.
- Used Crontab to schedule jobs and e-mailed ETL job status to the operations team daily, weekly and monthly.
- Used Director Client to validate, run, schedule and monitor the jobs that are run by WebSphere DataStage server.
- Used DataStage Designer to develop parallel jobs to extract, cleanse, transform, integrate and load data into Data Warehouse.
- Used DataStage Director to schedule, monitor and analyze DataStage jobs.
- Used DataStage Manager for job management and for migrating jobs or whole projects between different environments.
- Developed jobs in Ascential Parallel Extender (PX) using different stages such as Transformer, Aggregator, Lookup, Join, Merge, Modify, Remove Duplicates, Oracle, Sort, Peek, Row Generator, Column Generator, Sequential File and Data Set.
- Designed DataStage sequences to specify Job execution order.
- Tuned DataStage jobs to obtain better performance by using various tuning techniques.
- Involved in setup and configuration of DataStage Enterprise Edition.
- Implemented email, zip functionalities with UNIX shell scripts to use them in DataStage jobs.
- Imported and exported repositories across DataStage projects using DataStage Manager.
- Unit tested DataStage Jobs in development including creating the appropriate test data.
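The Crontab scheduling and status e-mails above can be sketched as below. The schedule, recipients and job names are placeholders, and the availability of `mailx` on the server is an assumption; the status formatting is kept in its own function so it can be checked in isolation.

```shell
# Hypothetical sketch of the Crontab-driven ETL status mail described
# above. Schedule, recipients and job names are placeholders; mailx is
# assumed to be available on the server.

# Build a one-line status summary for the operations mail.
status_line() {
  job="$1"; rc="$2"
  if [ "$rc" -eq 0 ]; then
    echo "ETL job $job: SUCCESS"
  else
    echo "ETL job $job: FAILED (rc=$rc)"
  fi
}

# In the real setup the nightly run was registered in crontab, e.g.:
#   30 2 * * * /apps/etl/run_nightly.ksh 2>&1 | mailx -s "ETL status" ops@example.com
```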
Environment: Ascential DataStage 7.5.2, Oracle 11g, PL/SQL, Toad, Windows XP, Crontab.