Project Manager Ab Initio (EDW)/Hadoop Resume
Charlotte, NC
SUMMARY
- Ability to collaborate with peers in both business and technical areas to deliver optimal business process solutions in line with corporate priorities
- Skilled at rapidly ramping up new teams to capture challenging opportunities.
- Experience in various phases of the Software Development Life Cycle (analysis, requirements gathering, design) with expertise in documenting requirement specifications, functional specifications, test plans, source-to-target mappings, and SQL joins.
- Good understanding of Relational Database Design and Data Warehouse/OLAP concepts and methodologies
- Implemented optimization techniques for better performance on both the ETL and database sides
- Experience in creating functional/technical specifications and data design documents based on requirements
- Excellent communication, interpersonal, and analytical skills, with a strong ability to perform in a team as well as individually.
- Proficiency in Big Data practices and technologies such as HDFS, YARN, MapReduce, Hive, Pig, HBase, Sqoop, Oozie, ZooKeeper, Flume, Kafka, Impala, and Spark.
- Strong knowledge of using Pig and Hive for processing and analyzing large volumes of data.
- Proficient in extending the core functionality of Hive and Pig by writing custom UDFs in Java to meet user requirements (see the sketch after this list).
- Eight years of experience working with DataStage 6.x/7.x/8.x (Parallel Extender) and Ab Initio ETL tools, and with the Teradata FastLoad and MultiLoad utilities.
- Extensive knowledge of the Bill Inmon and Ralph Kimball methodologies, advanced data warehousing architecture, and multidimensional data modeling techniques.
- Ability to design for operability, reuse, and unit and system testing, and to design strategies across the full ETL production life cycle.
- Familiar with Ab Initio, IIS (DataStage), and Teradata best practices with respect to each phase of ITIL/SDLC.
- Willing to relocate: Anywhere
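A minimal sketch of the kind of custom Hive UDF referenced above, using the classic org.apache.hadoop.hive.ql.exec.UDF API; the class name and the transformation it performs are illustrative, not taken from any project listed here (a Pig EvalFunc follows the same pattern):

```java
import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text;

// Illustrative Hive UDF: trims and upper-cases a string column.
// Registered in Hive with (jar and function names hypothetical):
//   ADD JAR normalize-udf.jar;
//   CREATE TEMPORARY FUNCTION normalize_text AS 'NormalizeText';
public final class NormalizeText extends UDF {
    public Text evaluate(final Text input) {
        if (input == null) {
            return null;  // propagate SQL NULLs unchanged
        }
        return new Text(input.toString().trim().toUpperCase());
    }
}
```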
TECHNICAL SKILLS
Operating Systems: Windows 10/NT, UNIX, AIX 5.3, MVS (VSAM), z13, IBM System/370
Database / Development: Hadoop Big Data, Teradata (FastLoad, MultiLoad), Oracle 10g, ODBC, SQL, PL/SQL, SQL Server 2000, DB2 UDB, SQL*Loader, SQL*Plus, Export/Import
Languages: MDX, COBOL, JCL, CICS, BTEQ, Java
BI Tools: DataStage 8.x/7.x/6.x Parallel Extender (Manager, Designer, Director, Administrator), Ab Initio Co>Op 2.14, COGNOS, MicroStrategy, OWB (Oracle Warehouse Builder), MDB, DMExpress SyncSort 4.2 (change data capture tool), Informatica Data Quality (IDQ)
Modeling and Project Management Tools: ERWIN 3.5, TOAD, SQL Developer, MS Project 2000 (project scheduling, resource allocation, tasks, subtasks, charts; certified)
Data Warehousing / Methodologies: Ralph Kimball, Bill Inmon, and IBM data warehousing methodologies
Business Process Control: Business Discovery, Process Definition, Process Cycle, Data Interaction
Application Design and Development: data warehousing and modeling using multidimensional (star/snowflake) and E-R approaches in large Enterprise Data Warehouse (EDW) applications, Operational Data Stores (ODS), data marts, and multidimensional data cube design for various industries (financial, insurance, mortgage, and banking).
Reporting & Administration: OLAP, ROLAP, HOLAP, and MOLAP using COGNOS 8.0 and MicroStrategy.
Functional Management and Leadership Skills: Worked as a Data Analyst, Project Manager, Technical Lead, Consultant, Senior Data Warehouse Developer, and Senior Applications Developer, with activities such as resource allocation based on projections, coordinating tasks and status with onsite/offshore PMs, providing technical support to team members, and induction training
PROFESSIONAL EXPERIENCE
Confidential
Project Manager, Ab Initio (EDW)/Hadoop
Responsibilities:
- Spearheading the full range of operations across multiple DWH projects
- Managing development, production support, and operations projects in an onshore/offshore model
- Applying knowledge of project lifecycles and methodologies as each project requires
- Involved in business requirements gathering and estimation
- Working to meet SLAs as defined in SOWs
- Acting as the liaison between customers, business analysts, development teams, and support teams
- Reprioritizing, tracking, and managing deliverables
- Responsible for planning and coordinating resources to deploy major releases within the predicted cost, time, and quality estimates.
- Coordinating the planning and control of release movement to test and live environments, ensuring the correct components are released
- Responsible for delivering services within agreed service levels and for ensuring that standards and guidelines are followed
- Maintaining, managing, and documenting all project reports, statements, and presentations per stakeholder requirements
- Ensuring that teams are able to gather, analyze, store, and share knowledge and information.
- Delivered POCs for Big Data technology adoption that demonstrated process speed improvements (see the sketch after this list).
- Extensively associated with the Confidential Big Data center of excellence group.
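A minimal sketch of the shape such a POC might take with Spark's Java API, assuming a delimited extract landed on HDFS; the paths and column names are hypothetical:

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public final class BatchAggregationPoc {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("batch-aggregation-poc")
                .getOrCreate();

        // Read a delimited extract from HDFS (hypothetical path/columns).
        Dataset<Row> txns = spark.read()
                .option("header", "true")
                .csv("hdfs:///poc/transactions");

        // Aggregate in one distributed pass, replacing a multi-step
        // batch ETL flow, and write the result back as Parquet.
        txns.groupBy("account_id")
            .count()
            .write()
            .mode("overwrite")
            .parquet("hdfs:///poc/account_txn_counts");

        spark.stop();
    }
}
```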
Environment: PeopleSoft, Ab Initio, MapReduce, Pig, Hive, Scala, Spark, Oracle, MS Project 10, Netezza, UNIX, Windows
Confidential - Charlotte, NC
Data Analyst
Responsibilities:
- Gathered inputs from architects and technical resources and prepared technical solutions and effort estimates.
- Prepared a high-level work breakdown structure (WBS)
- Prepared responses for vendor pool/pre-qualification opportunities
- Involved in end-to-end proposal development, including technical solution, estimation, and pricing.
- Performed database testing and report-level testing per requirements, with excellent knowledge of the data workflow gained by referring to FSDs (Functional Specification Documents).
- Developed an excellent understanding of the mapping between source and target by referring to the mapping document.
- Performed end-to-end mapping testing for the database as well as reports. The mappings were one-to-one in a lift-and-shift process, which meant verifying that data gathered in the target table mapped properly to the source table and that the same target table populated the same records correctly into the reporting tools (SAP BO, QlikView).
- Performed smoke tests covering primary checks such as record counts and column matching for the database, as well as dashboard testing.
- Performed testing at the SIT (System Integration Testing) and UAT (User Acceptance Testing) levels.
- Gathered requirements from the development team and database developers to analyze tables and entity relationships and understand the database.
- Designed the integration document/spreadsheet and derived the inputs and outputs of each integration point.
- Documented the acceptance criteria for each test case and built test cases based on test scenarios.
- Created test plan and strategy for the given LOB (Line of Business).
- Verified import/export and data obfuscation.
- Verified known issues and developed workarounds and wrappers as required.
- Identified data scenarios and business cases, and developed the corresponding test cases.
- Scripted and automated test cases; identified source data patterns and reports.
- Developed scripts for comparison with the target (see the sketch after this list); planned and ran SIT (System Integration Testing) for the given LOB.
- Developed test cases, established traceability between requirements and test cases.
- Performed data analysis to determine the completeness and accuracy of the data, and checked whether new data needed to be pulled or requested.
- Executed test cases and logged results, performed data validation testing as appropriate, tracked defects, and participated in defect resolution.
- Provided inputs to the test lead for documentation and reporting purposes. Identified, documented and updated testing dependencies and participants.
- Identified the primary point of contact for raising risks/issues around testing dependencies.
- Reported status on test execution, including risks/issues and targets, and shared the latest information in regular testing status meetings with all involved constituencies to ensure smooth test execution and timely issue resolution.
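A minimal sketch of the kind of source-to-target comparison script described above, implemented as a JDBC row-count check; the connection URLs, credentials, and table names are placeholders:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

// Compares row counts between a source and a target table, the first
// smoke check described above. All connection details are placeholders.
public final class RowCountCheck {

    private static long rowCount(Connection conn, String table) throws SQLException {
        try (Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT COUNT(*) FROM " + table)) {
            rs.next();
            return rs.getLong(1);
        }
    }

    public static void main(String[] args) throws SQLException {
        try (Connection source = DriverManager.getConnection(
                     "jdbc:oracle:thin:@//source-host:1521/SRC", "user", "pass");
             Connection target = DriverManager.getConnection(
                     "jdbc:teradata://target-host/DATABASE=TGT", "user", "pass")) {
            long src = rowCount(source, "CUSTOMER");
            long tgt = rowCount(target, "STG_CUSTOMER");
            System.out.printf("source=%d target=%d -> %s%n",
                    src, tgt, src == tgt ? "MATCH" : "MISMATCH");
        }
    }
}
```

A fuller comparison suite would extend this with column-level checks, i.e. the column-matching verification mentioned in the smoke-test bullet.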
Environment: DataStage, Ab Initio, HP ALM, SharePoint, MS Visio, MS Excel, Teradata SQL Assistant, QlikView, SAP BO, Oracle 11g, Microsoft SQL Server.
Confidential, Detroit, MI
Data Analyst
Responsibilities:
- Oversaw projects and served as the point of escalation to ensure project objectives, benefits, budgets, and timelines were met or exceeded.
- Assisted and supported project team members in completing their tasks.
- Mediated and resolved issues among team members.
- Motivated the team members.
- Reviewed, assessed, and evaluated project execution on a regular basis.
Confidential, Brooklyn, NY
Data Analyst
Responsibilities:
- Conducted interviews with technical analysts and business analysts regarding interfaces.
- Prepared current interface documents, i.e., specifications, flow diagrams, and mapping spreadsheets
- Performed data integrity checks and prepared exception reports using the DataStage ETL tool.
- Designed new interfaces and helped developers with DataStage jobs.
- Involved in Operational Data Store (ODS) design and deployment activities.
- Migrated data from various legacy systems to staging area.
- Helped with complex data issues and resolved vendor-specific issues.
Environment: IIS (DataStage 8.01 Enterprise Edition), ERWIN, z/OS, Z-Linux, SharePoint, DB2, Web Services, MS Project 2005
Confidential
Tech Lead / Enterprise Integration Analyst - SAP/IIS (DataStage) (OPTECH)
Responsibilities:
- Wrote and reviewed technical requirements for reusable components covering translation, preprocessing, derivation, validation, aggregation, the technical reconciliation key, the code combo key, and load-ready files for ECC & BI.
- Led a team of 12 people and helped senior management with status tracking and updates using a detailed project plan.
- Helped design and develop interfaces for both batch and manual journal entry posting.
- Leveraged enhanced features of DataStage Enterprise Edition such as parallelism and web services.
- Helped prepare sample data with the help of the business team.
- Prepared global integration design documents as the high-level design.
- Prepared the logical data integration model (LDIM) as the detailed design.
- Participated in brainstorming sessions on the functional data model (FDM) and physical data model (PDM) for ECC as well as BI.
- Reviewed complex SQL queries and helped with performance tuning.
- Helped identify key data elements for both the relational data model (ECC) and the dimensional model (BI).
- Participated in working sessions on code block and extended code block elements.
- Participated in peer reviews of functional specs.
- Designed currency split, document split, and technical reconciliation key assignment for the process of transforming data into load-ready documents.
- Participated in peer reviews for approving the functional requirements.
- Participated in code reviews of DataStage, Java, and QualityStage ETL jobs.
- Delivered the common source template for Type-1 (balanced journal entry) and Type-2 (single journal entry) source files.
- Designed and helped develop the data quality audit tracker, i.e., an automatic error-handling process.
Environment: IIS (DataStage 8.01 Enterprise Edition), ERWIN, SAP ECC, SAP BI, SAP MDM, z/OS, Z-Linux, PVCS, DB2, Web Services, ABAP, Java, XML, Autosys, Excel, Visio, E-Cub, MS Project 2003
Confidential, Atlanta, GA
Onsite Coordinator
Responsibilities:
- Understanding the existing environment, load processes and system standards
- Requirements gathering, technical design.
- Involved in preparation of functional specifications, the data dictionary, the project plan, and the source-to-target mapping document.
- Responsible for design, development, implementation, and tuning of the logical and physical data models and the ETL process for the Policy Delivery Data Mart. Involved in the design of fact and aggregate tables and the modification of conformed dimensions along with the data analyst team.
- Created DataStage jobs to load staging, dimension, and fact tables using transformations such as Source Qualifier, Aggregator, Filter, Expression, Lookup, Update Strategy, Router, and Joiner (see the sketch after this list).
- Responsible for development and support of the ETL routines required to populate the data warehouse; experienced in loading high-volume data and in tuning and troubleshooting mappings.
- Worked on the COGNOS framework model to design reports.
- Scheduled the loading processes and monitored sessions.
- Performed unit testing and system integration testing; coordinated with offshore for tasks and status.
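The lookup and update-strategy pattern those jobs implement can be sketched outside the tool; below is a hedged JDBC analog of a Type-1 dimension upsert: look up the business key, update on a hit, insert on a miss. The table and column names are hypothetical:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

// JDBC analog of a lookup + update-strategy stage: Type-1 upsert
// into a dimension table. Table/column names are illustrative.
public final class DimensionUpsert {

    public static void upsert(Connection conn, String policyKey, String status)
            throws SQLException {
        // Lookup: does the business key already exist in the dimension?
        try (PreparedStatement lookup = conn.prepareStatement(
                "SELECT dim_id FROM policy_dim WHERE policy_key = ?")) {
            lookup.setString(1, policyKey);
            try (ResultSet rs = lookup.executeQuery()) {
                String sql = rs.next()
                        // Hit: overwrite the attribute in place (Type 1).
                        ? "UPDATE policy_dim SET status = ? WHERE policy_key = ?"
                        // Miss: insert a new dimension row.
                        : "INSERT INTO policy_dim (status, policy_key) VALUES (?, ?)";
                try (PreparedStatement write = conn.prepareStatement(sql)) {
                    write.setString(1, status);
                    write.setString(2, policyKey);
                    write.executeUpdate();
                }
            }
        }
    }
}
```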
Environment: DataStage 7.5.1, TOAD, SQL, PL/SQL, ERWIN 3.5, COGNOS 8.0