- Big 4 consulting experience and 14+ years of IT experience in the implementation, audit, and maintenance of Data Warehousing, Data Integration, SAP, and Mainframe systems.
- Ability to collaborate with peers in both business and technical areas to deliver optimal business process solutions in line with corporate priorities
- Skilled at quickly ramping up new teams to capture challenging opportunities.
- Experience in various phases of the Software Development Life Cycle (analysis, requirements gathering, design), with expertise in documenting requirement specifications, functional specifications, test plans, source-to-target mappings, and SQL joins.
- Good understanding of Relational Database Design, Data Warehouse/OLAP concepts and methodologies
- Implemented optimization techniques for better performance on both the ETL and database sides
- Experience in creating functional/technical specifications and data design documents based on requirements
- Excellent communication, interpersonal, and analytical skills, with a strong ability to perform in a team as well as individually.
- Proficiency in Big Data practices and technologies such as HDFS, YARN, MapReduce, Hive, Pig, HBase, Sqoop, Oozie, ZooKeeper, Flume, Kafka, Impala, and Spark.
- Strong knowledge of using Pig and Hive for processing and analyzing large volumes of data.
- Proficient in extending the core functionality of Hive and Pig by writing custom UDFs in Java to meet user requirements.
- Eight years of experience working with DataStage 6.x/7.x/8.x (Parallel Extender), Ab Initio ETL tools, and Teradata FastLoad and MultiLoad utilities.
- Vast knowledge of Bill Inmon and Ralph Kimball methodologies, advanced data warehousing architecture, and multidimensional data modeling techniques.
- Ability to design for operability, reuse, and unit and system testing, and to design strategies across the full ETL production life cycle.
- Familiar with best practices of Ab Initio, IIS (DataStage), and Teradata with respect to each phase of ITIL/SDLC.
- Willing to relocate: Anywhere
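The custom Hive/Pig UDF work noted in the summary can be sketched roughly as follows. This is a minimal, hypothetical example: the class name, the phone-number normalization rule, and the registration commands in the comments are illustrative assumptions, not the actual production UDFs. The core logic is a plain static method so the sketch stands alone without the Hive jars.

```java
// Minimal sketch of the core logic a custom Hive UDF might wrap (hypothetical).
// In Hive, this method would live in a class extending
// org.apache.hadoop.hive.ql.exec.UDF (or the newer GenericUDF), packaged into
// a jar and registered with:
//   ADD JAR /path/to/udfs.jar;
//   CREATE TEMPORARY FUNCTION normalize_phone AS 'com.example.NormalizePhoneUDF';
public class NormalizePhone {

    // Strip everything but digits and reduce a US number to its 10-digit form.
    public static String evaluate(String raw) {
        if (raw == null) {
            return null;                      // Hive UDFs pass NULL through
        }
        String digits = raw.replaceAll("\\D", "");
        if (digits.length() == 11 && digits.startsWith("1")) {
            digits = digits.substring(1);     // drop the US country code
        }
        return digits.length() == 10 ? digits : null;  // reject malformed input
    }
}
```

Keeping the logic in a plain method also makes it unit-testable outside the cluster before deploying the jar.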
Operating System: Windows 10/NT, UNIX, AIX 5.3, MVS (VSAM), z13, IBM System/370
Database / Development: Hadoop Big Data, Teradata, FastLoad, MultiLoad, Oracle 10g, ODBC, SQL, PL/SQL, SQL Server 2000, DB2 UDB, SQL*Loader and SQL*Plus, Export/Import
Languages: MDX, COBOL, JCL, CICS, BTEQ, Java
BI Tools: DataStage 8.x/7.x/6.x, Parallel Extender (Manager, Designer, Director, Administrator), Ab Initio Co-Op 2.14, COGNOS, MicroStrategy, OWB (Oracle Warehouse Builder), MDB, DMExpress SyncSort 4.2 (Change Data Capture tool), Informatica Data Quality (IDQ)
Modeling and Project Management Tools: ERWIN 3.5, TOAD, SQL Developer, MS Project 2000 (Project Scheduling, Resource Allocation, Tasks, Sub Tasks, Charts - Certified)
Methodologies: Ralph Kimball, Bill Inmon, and IBM Data Warehousing methodologies
Business Process Control: Business Discovery, Process Definition, Process Cycle, Data Interaction
Application Design and Development: Data warehousing and modeling using multidimensional (Star/Snowflake) and E-R approaches in large Enterprise Data Warehouse (EDW) applications, Operational Data Stores (ODS), Data Marts, and multidimensional data cube design for various industries (Financial, Insurance, Mortgage, and Banking).
Reporting & Administration: OLAP, ROLAP, HOLAP, and MOLAP using COGNOS 8.0, MicroStrategy.
Project Manager, Ab Initio (EDW)/Hadoop
- Spearheading responsibility for managing the full range of operations pertaining to multiple DWH projects
- Managing development, production support, and operations projects in an onshore/offshore model
- Knowledge of project lifecycles and methodologies, implemented per project requirements
- Involved in business requirements gathering and estimations
- Working to meet SLAs as defined in SOWs
- Act as the liaison between the Customers, Business Analysts, Development teams and Support Teams
- Re-prioritizing, tracking, and managing deliverables
- Responsible for planning and coordinating the resources to deploy a major release within the predicted cost, time, and quality estimates.
- Coordinating the planning and control of release movement to test and live environments, ensuring the correct components are released
- Responsible for delivering a particular service within the agreed service levels and for ensuring that standards and guidelines are followed
- Maintain, manage and document all project reports, statements and presentations as per stakeholder requirements
- Ensured that teams were able to gather, analyze, store, and share knowledge/information.
- Delivered POCs for Big Data technology adoption, demonstrating processing speed improvements.
- Extensively associated with the Confidential Big Data center of excellence group.
Environment: PeopleSoft, Ab Initio, MapReduce, Pig, Hive, Scala, Spark, Oracle, MS Project 10, Netezza, Unix, Windows
Confidential - Charlotte, NC
- Gathered inputs from architects and technical resources and prepared technical solutions and effort estimates.
- Prepared high-level work breakdown structures (WBS)
- Prepared responses for Vendor Pool/Pre-Qual opportunities
- Involved in end-to-end proposal development that includes technical solution, estimation and pricing.
- Performed database testing and report-level testing per requirements, with excellent knowledge of the data workflow gained by referring to FSDs (Functional Specification Documents).
- Developed an excellent understanding of the mapping between source and target by referring to the mapping document.
- Performed end-to-end mapping testing for the database as well as reports. The mappings were one-to-one in a lift-and-shift process, which meant verifying that the data gathered in the target table mapped properly to the source table and that the same target table populated the same records into the reporting tools (SAP BO, QlikView).
- Performed smoke tests covering primary checks such as record counts and column matching for the database, as well as dashboard testing.
- Performed testing at SIT (System Integration Testing) level and UAT (User Acceptance testing) level.
- Gathered requirements from the development team and database developers to analyze the tables and entity relationships for understanding the database.
- Designed the integration document/spreadsheet and derived the inputs and outputs of each integration point.
- Documented the acceptance criteria for each of the test cases. Built the test cases based on test scenarios.
- Created test plan and strategy for the given LOB (Line of Business).
- Verified import/export and data obfuscation.
- Verified known issues; developed workarounds and wrappers as required.
- Identified data scenarios and business cases; developed test cases accordingly.
- Scripted and automated test cases. Identified source data pattern and reports.
- Developed scripts for comparison with the target. Planned and ran the SIT (System Integration Testing) for the given LOB.
- Developed test cases, established traceability between requirements and test cases.
- Performed data analysis to determine the completeness and accuracy of the data, or checked whether new data needed to be pulled/requested.
- Executed test cases & log results, performed data validation testing as appropriate, tracked defects and participated in defect resolution.
- Provided inputs to the test lead for documentation and reporting purposes. Identified, documented and updated testing dependencies and participants.
- Identified primary point of contact to raise the risks/issues around testing dependencies.
- Reported status on test execution including risks/issues and targets. Updated latest information in regular testing status meetings with all involved constituencies to ensure smooth test execution and timely issue resolution.
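The smoke checks described above (record counts and column matching between source and target) boil down to comparisons like the sketch below. It is a simplified illustration: in practice the counts and column lists would come from `SELECT COUNT(*)` queries and JDBC metadata against the source and target databases, whereas here they are passed in directly.

```java
import java.util.ArrayList;
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Set;

// Simplified sketch of a source-vs-target smoke check (record counts and
// column matching). Inputs stand in for values that would really come from
// COUNT(*) queries and JDBC metadata.
public class SmokeCheck {

    // Returns human-readable findings; an empty list means the check passed.
    public static List<String> compare(long sourceCount, long targetCount,
                                       Set<String> sourceColumns,
                                       Set<String> targetColumns) {
        List<String> findings = new ArrayList<>();
        if (sourceCount != targetCount) {
            findings.add("Record count mismatch: source=" + sourceCount
                    + " target=" + targetCount);
        }
        Set<String> missing = new LinkedHashSet<>(sourceColumns);
        missing.removeAll(targetColumns);       // columns lost in the load
        if (!missing.isEmpty()) {
            findings.add("Columns missing in target: " + missing);
        }
        Set<String> extra = new LinkedHashSet<>(targetColumns);
        extra.removeAll(sourceColumns);         // unexpected columns in target
        if (!extra.isEmpty()) {
            findings.add("Unexpected columns in target: " + extra);
        }
        return findings;
    }
}
```

Collecting findings rather than failing fast lets one smoke run report every mismatch for a table at once.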
Environment: Data Stage, Ab Initio, HP-ALM, SharePoint, MS-Visio, MS-Excel, Teradata SQL Assistant, Qlikview, SAP-BO, Oracle 11g, Microsoft SQL Server.
- Oversee projects and serve as the point of escalation to ensure project objectives, benefits, budgets, and timelines are met or exceeded.
- Strong familiarity with industry recognized project management and business analyst methodologies.
- Confidential is a major auto finance organization that required the IBM Banking Data Warehouse logical model to be customized, re-engineered, designed, and deployed with minimal support and documentation.
- Successfully customized the IBM model: complex ETL jobs were designed and developed, the Enterprise Data Warehouse was built conforming to 3NF, and the dimensional data marts were designed in a star schema.
- The LAO BI data warehouse and data mart implementation program uses DataStage, Oracle, and ERWIN.
- Assisted and supported project team members in completing their tasks.
- Mitigated issues among team members.
- Motivated team members.
- Reviewed, assessed, and evaluated execution of the projects on a regular basis.
Confidential, Brooklyn, NY
Data Migration Specialist-Data Stage/EDW
- Conducted interviews with technical analysts and business analysts regarding interfaces.
- Prepared current interface documents, i.e., specifications, flow diagrams, and mapping spreadsheets
- Performed data integrity checks and prepared exceptions reports using Data Stage ETL tool.
- Designed new interfaces and helped developers with respect to Data Stage jobs.
- Involved in Operations Data Store (ODS) design and deployment activities.
- Migrated data from various legacy systems to staging area.
- Extended help on complex data issues and resolved vendor-specific issues.
Environment: IIS (Data Stage 8.01 - Enterprise Edition), ERWIN, Z/OS, Z-Linux, SharePoint, DB2, Web Services, MS Project 2005
Tech Lead/ Enterprise Integration Analyst
- Wrote and reviewed technical requirements for reusable components of translation, preprocessing, derivation, validation, aggregation, technical reconciliation key, code combo key, load ready file for ECC & BI.
- Led a team of 12 people and helped senior management with status tracking and updates using a detailed project plan.
- Helped in design and development of interfaces for batch as well as manual journal entry posting.
- Leveraged the enhanced features of Data Stage enterprise edition with respect to parallelism, web services etc.
- Helped in preparing sample data with the help of business team.
- Prepared global integration design documents as high level design.
- Prepared logical data integration model (LDIM) as detail design.
- Participated in brainstorming sessions on the functional data model (FDM) and physical data model (PDM) for ECC as well as BI.
- Reviewed complex SQL queries and helped in performance tuning.
- Helped in key data elements identification for both relational data model (ECC) and dimensional model (BI).
- Participated in code block and extended code block elements working sessions.
- Participated in peer reviews of functional specs.
- Designed currency split, document split, and technical reconciliation key assignment for the process of transforming data into load ready documents.
- Participated in peer reviews for approving the functional requirements.
- Participated in code reviews of Data Stage, Java and Quality Stage ETL jobs.
- Delivered the common source template for Type-1 (balanced journal entry) and Type-2 (single journal entry) source files.
- Designed and helped in development of data quality audit tracker i.e. automatic error handling process.
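The data quality audit tracker (automatic error handling) mentioned above can be pictured roughly as below. The field names and rejection rules are hypothetical stand-ins for the actual DataStage/QualityStage implementation; the idea is that rows failing validation are diverted to an audit list with a reason code instead of aborting the load.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

// Rough sketch of an automatic error-handling audit tracker. Rows that fail
// validation are recorded with a reason code and excluded from the load.
// Field names and rules are illustrative assumptions, not the real job logic.
public class AuditTracker {

    private final List<String> rejectReasons = new ArrayList<>();

    // Returns true if the row is load-ready; otherwise records it for audit.
    public boolean accept(Map<String, String> row) {
        String amount = row.get("amount");
        if (amount == null || amount.isBlank()) {
            rejectReasons.add("MISSING_AMOUNT");
            return false;
        }
        try {
            Double.parseDouble(amount);       // must be numeric to post
        } catch (NumberFormatException e) {
            rejectReasons.add("NON_NUMERIC_AMOUNT");
            return false;
        }
        return true;
    }

    public List<String> rejectReasons() {
        return rejectReasons;
    }
}
```

In the real pipeline the reject list would feed an audit table rather than stay in memory, so rejected journal entries can be reviewed and reprocessed.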
Environment: IIS (Data Stage 8.01 - Enterprise Edition), ERWIN, SAP ECC, SAP BI, SAP MDM, Z/OS, Z-Linux, PVCS, DB2, Web Services, ABAP, Java, XML, Autosys, Excel, Visio, E-Cub, MS Project 2003
Confidential, Atlanta, GA
- Understanding the existing environment, load processes and system standards
- Requirements gathering, technical design.
- Involved in Preparation of Functional Specifications, Data Dictionary, Project Plan, source to Target mapping Document.
- Responsible for design, development, implementation, and tuning of the logical and physical data models and the ETL process for the Policy Delivery Data Mart. Involved in the design of fact and aggregate tables and the modification of conformed dimensions, along with the data analyst team.
- Created DataStage jobs to load staging tables, dimension tables, and fact tables using transformations such as Source Qualifier, Aggregator, Filter, Expression, Lookup, Update Strategy, Router, and Joiner.
- Responsible for development and support of the ETL routines required to populate the data warehouse; experienced in loading high-volume data and in tuning and troubleshooting mappings.
- Worked on the COGNOS Framework Manager model to design reports.
- Scheduled the Loading processes and monitored sessions.
- Unit testing and System Integration Testing; coordinated with offshore on tasks and status.
Environment: DataStage 7.5.1, TOAD, SQL, PL/SQL, ERWIN 3.5, COGNOS 8.0
- Resource planning and delegation of tasks
- Tracking the status using MSP 2003
- Updating the status to the concerned managers
- Captured the metrics with respect to defects, service requests as per CMMI Level 5 standards
- Organizing the training programs as per requirement.
- Understanding the existing environment, load processes and system standards
- Migrating data from SAS system to Oracle system using DataStage Parallel Extender
- Migrated PL/SQL modules into DataStage jobs
- Participated in developing Logical schema design and physical design
- Creating automated canned reports, automated alerts, user rights, dashboard metrics
- Designed ad-hoc reporting framework using MicroStrategy
- Unit Testing and System Integration Testing
Environment: DataStage 7.5.1, Ab Initio 2.1.4, Oracle 10g/9i, SQL, SQL*Loader, SAS, ERWIN, MSP 2003.
Consultant Data Stage
- Worked with business customers for requirements gathering, Logical Design, Physical Design model meetings to understand the data flow, frequency of data loads.
- Worked with architects and subject-matter experts to design comprehensive solutions.
- Designed and coded the ETL logic using DataStage to enable initial load and incremental processing and exception handling, restart ability and recovery, data cleanup, validation and monitoring.
- Captured data from a variety of sources including ASCII files, EBCDIC files, and mainframe datasets.
- Used Complex Flat File and Dataset stages depending on the requirement and layout for parallelism of data.
- Used Join/Merge/Lookup stages to replicate integration logic based on the volumes of the data.
- Extensively used most of the other parallel stages, such as Row Generator, Column Generator, Modify, Funnel, Filter, Switch, Aggregator, Remove Duplicates, and Transformer.
- Used DB2 UDB's Export/Import/Load utilities for fast loading and unloading of data into DB2 tables.
- Developed BTEQ scripts and used FastLoad and MultiLoad utilities to load Teradata tables.
- Created jobs in DataStage to extract from data sources like SQL Server 2000.
- Involved in code migration from one environment to another using DataStage Manager's import/export functionality for migrating DataStage jobs.
- Extensively used the DataStage Director utility to monitor and debug DataStage code.
- Involved in DataStage administration tasks such as creation of projects, environment variables, and configuration files.
- Worked on the COGNOS Framework Manager model for designing reports
- Involved in creating AutoSys schedules for job dependencies and timings.
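The incremental processing and restartability described in this role typically hinge on a high-water-mark checkpoint. The sketch below illustrates that idea with an in-memory timestamp; in a real DataStage batch the watermark would be persisted in a control table so a restart resumes from the last committed checkpoint instead of reprocessing rows. The column name `updated_at` is an illustrative assumption.

```java
import java.time.Instant;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

// Sketch of high-water-mark incremental extraction. A real job would persist
// the watermark in a control table for restartability; here it is in memory.
public class IncrementalExtract {

    private Instant watermark;

    public IncrementalExtract(Instant initialWatermark) {
        this.watermark = initialWatermark;
    }

    // Select only rows changed since the last run, then advance the watermark.
    public List<Map<String, Object>> delta(List<Map<String, Object>> sourceRows) {
        List<Map<String, Object>> changed = sourceRows.stream()
                .filter(r -> ((Instant) r.get("updated_at")).isAfter(watermark))
                .collect(Collectors.toList());
        for (Map<String, Object> r : changed) {
            Instant ts = (Instant) r.get("updated_at");
            if (ts.isAfter(watermark)) {
                watermark = ts;               // checkpoint for the next run
            }
        }
        return changed;
    }

    public Instant watermark() {
        return watermark;
    }
}
```

Advancing the watermark only after selecting the delta mirrors the commit-then-checkpoint ordering that makes a rerun safe after a failure.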
Environment: Ascential DataStage 7.5/7.1 (Parallel Extender), Oracle 9i, DB2 8.0, Teradata 6.0, FastLoad, MultiLoad, FastExport, BTEQ, SQL, PL/SQL, Shell Scripts, AutoSys, PVCS Version Control, COGNOS, Crystal Reports XI, UNIX AIX 5.2, Windows XP.
Data Stage ETL Developer
- Involved in Source systems Analysis.
- Discussed with Data Architect and gathered requirements.
- Worked on tuning the SQL and DataStage jobs to improve performance of various reports.
- Worked on conversion of DataStage server jobs to Parallel Extender, thereby improving the performance of the batch cycle.
- Worked on scheduling jobs using AutoSys and wrote scripts to insert/append jobs and conditions, analyze dependencies, etc.
Environment: DataStage 7.0/7.1(EE), Oracle 9i, Oracle Designer 2000, Cognos Report Net, Cognos8, SunOS UNIX 5.8, AutoSys, Lotus Notes.
Confidential, Boston, MA
- Designed and developed the star schema database and mappings between sources and operational staging targets.
- Extracted, cleansed, transformed, integrated, and loaded data into the data warehouse using DataStage Designer.
Environment: DataStage 6.2, Oracle 8i, DB2 UDB 8.1, PeopleSoft 8.x, SAP 4.x, Siebel 5.0, MS SQL Server 2000, MS Access 97/2000, MS Excel 2000, MicroStrategy, SQL*Loader, Unix AIX 4.3x, and Windows NT 4.0.