
Lead Data Consultant Resume


MN

SUMMARY:

  • Over 9 years of IT experience providing technical leadership on ETL/Data Warehousing, Data Integration, Data Acquisition, Data Migration, Data Modeling, Data Profiling, Data Analysis, Data Cleansing, Data Quality, Data Processing, and Data Marts, including implementation, maintenance, testing, and production support of applications.
  • Experience in all aspects of the project development life cycle, including analysis, design, development, implementation, modeling, testing, deployment, and support for various projects, consistent with business requirements and specifications to deliver business value.
  • Expertise in all activities related to the development, upgrade, migration, implementation, and support of ETL processes for large-scale data warehouses and complex, cross-departmental projects using DataStage 11.3/8.5.
  • Extensive experience with data Extraction, Transformation, and Loading (ETL) from multiple data sources (Oracle, DB2, SQL Server, XML, flat files) into analytical and enterprise data models.
  • Expertise in implementing real-time and near-real-time data movement using IBM InfoSphere Change Data Capture (CDC), including creating subscriptions for Oracle, SQL Server, and DB2 database tables.
  • Experienced in data standardization, data modeling, and dimensional modeling using ER diagrams (conceptual, logical, and physical model design), with experience across ODS, DWH, and data mart layers.
  • Expert in using Parallel Extender and developing parallel jobs with stages including Lookup, Join, Transformer, Sort, Merge, Funnel, Java Integration, CDC Transaction, XML, MQ, Teradata, SAP ABAP Extract, and SAP BW Load.
  • Expertise in architecture, database design, development of BASIC routines, SQL and PL/SQL stored procedures, packages, and functions, and query and job tuning for better performance.
  • Developed and implemented best practices/methodologies in the ETL tool DataStage (naming conventions, mapping standards, versioning, etc.) to ensure consistency across the organization.
  • Participated in design walk-throughs and test case reviews with attention to performance, stability, scalability, and testability, and provided input for Go/No-Go decisions.
  • Created effort/duration estimates and deliverable lists for proposed projects; guided, trained, and assisted others on a variety of technical issues.
  • Experienced in writing UNIX shell scripts to automate ETL processes, using schedulers such as TWS and Autosys, on Windows XP, AIX, Sun Solaris, and Linux.
  • Exposure to writing BTEQ scripts and loading Teradata using the FastLoad, MultiLoad, and TPump utilities (see the sketch after this list).
  • Well versed in software development methodologies, from heavyweight Waterfall to Agile/Scrum.
  • Exposure to the Global Delivery Model (GDM) and experience coordinating projects in an onsite-offshore model involving multiple vendors and cross-functional teams.
  • Dedicated, self-motivated achiever committed to success and adept at handling multiple tasks in a high-pressure environment.
  • Excellent communication skills and the ability to work in groups as well as independently to recommend project priorities and deadlines and develop solutions in global projects.
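
The BTEQ exposure above can be illustrated with a minimal shell sketch: a wrapper that runs a row-count check through a BTEQ heredoc and fails the batch step on a nonzero return code. The TDP id, credentials, and staging table are hypothetical placeholders, not details from any specific engagement.

#!/bin/bash
# Minimal BTEQ wrapper: run a validation query and fail the batch step
# if BTEQ exits nonzero. All names below are hypothetical placeholders.
TDP_ID="tdprod"                                # hypothetical Teradata TDP id
LOGFILE="/tmp/bteq_sales_$(date +%Y%m%d).log"

bteq <<EOF > "$LOGFILE" 2>&1
.LOGON ${TDP_ID}/etl_user,etl_password;
SELECT COUNT(*) FROM sandbox.sales_stg;
.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
.QUIT 0;
EOF

rc=$?
if [ $rc -ne 0 ]; then
  echo "BTEQ step failed with rc=$rc, see $LOGFILE" >&2
  exit $rc
fi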

TECHNICAL SKILLS:

ETL Technologies: IBM InfoSphere DataStage 11.3/8.5/7.5, Informatica PowerCenter 7.x, IBM InfoSphere CDC 10.3/6.3, Talend DI

OLAP: Business Objects XI, Cognos 7.x

Operating Systems: Windows NT 4.0/XP/2000/98, UNIX, Linux

RDBMS/DB Tools: Oracle 11g, SQL Server 2005, DB2, Teradata, SQL Developer, TOAD, Erwin 4.1

Languages: SQL, PL/SQL, Java, BASIC, Unix Shell Scripting

Scheduling: TWS, Autosys, Control-M

Data Modeling: E/R Modeling, Dimensional Modeling (Star and Snowflake Schemas)

Version Control: PVCS Dimensions, Microsoft TFS, SVN, ClearCase

PROFESSIONAL EXPERIENCE:

Confidential, MN

Lead Data Consultant

Responsibilities:

  • Develop conceptual, logical, and physical designs for varied data types and large data volumes.
  • Architect, design, and implement high-performance, large-volume data integration processes, databases, storage, and other back-end services in fully virtualized environments.
  • Work closely with customers, at both technical and user levels, to design and produce solutions.
  • Work closely with the product management and development teams to rapidly translate an understanding of customer data and requirements into products and solutions.
  • Participate in Rapid Application Development and Agile processes to deliver new cloud platform services and components.
  • Determine organizational strategies for data integrity validation processes; establish policies and best practices for optimizing ETL data throughput and accessibility.
  • Supervise the department's processes for data construction and design.
  • Provide end-to-end data design for user data requirements.
  • Own end-to-end foundation data build-up, reporting capability build-out, and report/user migration from legacy systems; advise on the advanced functions and operational techniques of data administration tools and systems.
  • Build driver processes using keys to load data efficiently.
  • Build publication processes and publish data from the stage and integration layers to external targets using push and pull methods.
  • Design data access, usage, and change rules for users within and across organizations.
  • Work on a proof of concept for moving data processing from Teradata to Hadoop.
  • Designed ETL jobs for extracting, cleansing, transforming, integrating, and loading data into different data marts using DataStage tools, including error processing, reprocessing logic, count-match logic, and restartability logic (a restartability sketch follows this list).
  • Wrote system specifications by collecting business requirements and translating user requirements into technical specifications.
  • Documented mapping guidance and standards to ensure consistency and to train team members.
  • Identified and documented the data sources and transformation rules required to populate and maintain data warehouse content.
  • Developed Talend jobs to manage master data for single or multiple domains: customer, patient, member, location, and provider.
  • Designed ETL jobs for extracting from Hive snapshot tables, then transforming and loading data into different stage/base tables using Talend jobs.
  • Involved in data warehouse project tasks such as planning, requirement gathering, identifying sources, resource allocation, project execution, and planning of implementation activities.
  • Worked extensively with stages such as Sequential File, Lookup, Aggregator, Transformer, Join, Pivot, Sort, Filter, and DB2/UDB Enterprise when developing jobs.
  • Performance-tuned ETL jobs as part of the simplification of the enterprise data warehouse.
  • Developed job sequences and scheduled jobs based on dependencies using Control-M.
  • Imported metadata from the repository and exported/imported projects across various platforms.
  • Designed reusable logic using Shared Containers in DataStage Designer.
  • Used DataStage Director to validate jobs before compilation and debugged components by monitoring the resulting executable runs.
  • Defined the partitioning/indexing strategy for data mart tables to enhance performance for reporting and analytics.
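
As a rough illustration of the restartability pattern mentioned above, the shell sketch below keeps a checkpoint file of completed steps so a rerun skips work already done. The project (DWPROJ), job names, and paths are hypothetical, and the dsjob client is assumed to be on the PATH.

#!/bin/bash
# Restartable two-step flow: a checkpoint file records completed steps
# so a failed run can be resumed without redoing finished work.
# Project (DWPROJ), job names, and paths are hypothetical.
CKPT=/data/ckpt/load_sales.ckpt
step_done() { grep -q "^$1$" "$CKPT" 2>/dev/null; }
mark_done() { echo "$1" >> "$CKPT"; }

if ! step_done EXTRACT; then
  dsjob -run -jobstatus DWPROJ SalesExtract || exit 1
  mark_done EXTRACT
fi
if ! step_done LOAD; then
  dsjob -run -jobstatus DWPROJ SalesLoad || exit 1
  mark_done LOAD
fi
rm -f "$CKPT"   # clear the checkpoint once the whole flow succeeds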

Environment: Talend, Hive, DataStage 11.3, IBM CDC, Erwin 4.2, Teradata, Windows XP, UNIX, TWS, AQT, Team Foundation Server

Confidential, MN

Senior Lead Data Consultant

Responsibilities:

  • End-to-end implementation of the ETL grid migration from DataStage 8.5 to 11.3.
  • End-to-end implementation of the TWS migration from a single node to four nodes.
  • The project involved multiple phases testing different patterns: flat file, Windows compressed file, ASCII zipped file, EBCDIC file, database direct access (Oracle), database direct access (DB2), JDBC access, MRR data file creation, DB replication (LogMiner), DB replication (QREP), MQ, and XML.
  • Played a key role in impact analysis, identifying every stage and file-pattern process to be covered under 11.3 testing as part of the migration effort.
  • Resolved XML and Data Connector issues during the move from the old servers to the new ones, identified several driver installation issues on the new server, and assisted the admin teams.
  • Identified that DB2 Transaction Stage (DTS) functionality was not working properly in 11.3 and raised a PMR with IBM; previously identified a similar issue in XML Parser functionality (chunk connections) in 8.5 in 2011, which IBM resolved in version 9.1.
  • Performed QA validation of extract files and database table loads, comparing 8.5 server outputs against 11.3 outputs (a comparison sketch follows this list).
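
A minimal sketch of that output comparison, assuming hypothetical extract paths: row counts first, then a diff on sorted copies, since parallel jobs on the two servers may emit rows in different orders.

#!/bin/bash
# Compare one extract produced by the 8.5 and 11.3 servers:
# row counts first, then a content diff on sorted copies
# (parallel jobs may write rows in a different order).
# Both paths are hypothetical placeholders.
OLD=/qa/ds85/out/customer_extract.dat
NEW=/qa/ds113/out/customer_extract.dat

printf "rows: 8.5=%s 11.3=%s\n" "$(wc -l < "$OLD")" "$(wc -l < "$NEW")"

if diff <(sort "$OLD") <(sort "$NEW") > /tmp/customer_extract.diff; then
  echo "MATCH: outputs identical after sorting"
else
  echo "MISMATCH: see /tmp/customer_extract.diff" >&2
  exit 1
fi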

Confidential, MN

Lead ETL Developer

Responsibilities:

  • Built the ETL framework and design; migrated reporting capabilities.
  • Provided end-to-end ETL and data design for user data requirements.
  • Took end-to-end ownership of foundation data build-up, reporting capability build-out, and report/user migration from the legacy system.
  • Performed source system analysis and logical data mapping, and supported data modeling activities.
  • Designed the physical data model for reporting business views and data extraction from the UDW per business requirements.
  • Drove end-user mobilization and all SIT, UAT, and BAT sign-offs.
  • Managed the team of ETL analysts and served as project lead, driving releases to successful deployment.
  • Guided the team on performance enhancements to the reporting SQL of BusinessObjects data providers.
  • Monitored and troubleshot FastLoad, MultiLoad, and FastExport jobs for loading data from Oracle to Teradata and for data export.
  • Worked with multiple stakeholders within the project to ensure the integrated project moved in the right direction.
  • Provided technical consulting and guidance to development teams on the design and development of highly complex or critical ETL architecture.
  • Identified opportunities for new architectural initiatives; made recommendations for increasing the scalability and robustness of ETL platforms and solutions.
  • Evaluated new tools and techniques and made recommendations to improve real-time and batch data access, transformation, and movement across heterogeneous technologies and platforms.
  • Determined organizational strategies for data integrity validation processes; established policies and best practices for optimizing ETL data throughput and accessibility.
  • Advised on the advanced functions and operational techniques of data administration tools and systems.
  • Monitored data quality, compatibility, and stability to meet the current and future needs of the organization.
  • Designed data access, usage, and change rules for users within and across organizations.
  • Delivered a data integration and reporting solution by building a Unified Data Warehouse and a portal for users to navigate to standard canned and ad hoc reporting solutions and dashboards.
  • Performed data profiling of the source system, requirement gathering, data modeling, and design analysis, and prepared high-level design, detailed design, and mapping documents.
  • Profiled data using SQL to identify sources and establish integration points across different subject areas, which served as input to requirements and data modeling activities.
  • Created the logical and physical data models in Erwin, working closely with SMEs on the source system.
  • Created the high-level design and source-to-target mappings with transformation rules for every attribute, used by developers to begin development.
  • Served as SME for technical, testing, and functional questions.
  • Designed ELT/ETL patterns to capture near-real-time and batch data and established critical paths.
  • Created the semantic design to meet both business and performance needs.
  • Provided general design consultation on best practices and EDW approaches.
  • Designed patterns for history/incremental loads.
  • Designed ETL jobs for extracting, cleansing, transforming, integrating, and loading data into different landing/foundation tables.
  • Coordinated with the testing team for the system testing, UAT, and regression phases; reviewed test cases and the test strategy.
  • Interacted with the business community, gathered requirements based on changing needs, and incorporated the identified factors into the ETL.
  • Involved in project effort estimation, tracking, schedule adherence, finalization of technical/functional specifications, and status reporting to senior management.
  • Facilitated requirement workshops and coordinated functional/technical training in the project to increase the team's business knowledge.
  • Developed jobs to perform ETL operations loading data to the UDW from different sources.
  • Involved in various test phases, including functional, system, regression, performance, and UAT.
  • Developed UNIX shell scripts for file manipulation, FTP, and archiving log files (see the sketch after this list).
  • Efficiently implemented Change Data Capture (CDC) to extract data based on supplemental log data using IBM InfoSphere CDC technology.
  • Prepared technical specifications using use cases and maintained versions across functional changes.
  • Involved in monitoring production ETL applications and providing production support through tickets; executed scripts and managed scheduled activities based on business requirements.
  • Worked on optimizing and performance-tuning targets and sources, and on clearing mutexes in SAP BW, to increase process efficiency.
  • Wrote TWS job streams for scheduling the jobs.
  • Turned over artifacts and transitioned knowledge to L2/L3 and support teams.
  • Served as deployment captain for the different releases, coordinating planning, reviews, status tracking, and job scheduling, and communicating status to the leadership team.
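
A minimal sketch of the file-handling script work described above, with a hypothetical host, credentials, feed name, and directory layout: pull a daily feed over FTP, sanity-check it, and compress aging logs.

#!/bin/bash
# Pull a daily feed over FTP, verify it arrived non-empty, then archive
# ETL logs older than 7 days. Host, credentials, and paths are hypothetical.
HOST=feeds.example.com
FEED=claims_$(date +%Y%m%d).txt

ftp -inv "$HOST" <<EOF
user etl_user etl_password
cd /outbound
lcd /data/inbound
get $FEED
bye
EOF

[ -s "/data/inbound/$FEED" ] || { echo "feed $FEED missing or empty" >&2; exit 1; }

find /data/logs -name '*.log' -mtime +7 -exec gzip {} \;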

Environment: Informatica, IBM CDC, Erwin 4.2, Teradata, Windows XP, UNIX, TWS, AQT

Confidential, MN

Senior ETL Developer

Responsibilities:

  • Performed impact analysis at the interface/component level and created detailed analysis reports.
  • Involved in setting up the ETL environments and connectivity between the multiple applications.
  • Performed compatibility testing for all the databases and external plug-ins (e.g., MQ).
  • Standardized the ETL jobs per coding standards, including removal of hard-coded directories and database connections, security and FTP remediation, the purge process, scheduling, and grid enablement (a parameterization sketch follows this list).
  • Reviewed code and streamlined the unit test cases, with proper testing results, for all the ETL projects.
  • Identified the TWS details and verified that the stage TWS was in sync with the production TWS.
  • Turned over artifacts and transitioned knowledge to EAM/support teams.
  • Served as technical lead accountable for data integration and single point of contact (SPOC) for all technical issues related to coding, design, migration, and implementation.
  • Performed problem analysis, identified root causes, and outlined resolution options.
  • Handled effort estimation and tracking, schedule adherence, and status reporting to the clients.
  • Designed, built, tested, and deployed ETL components in DataStage.
  • Defined the data architecture for a smoother implementation by identifying synergies between the various sources; developed routines and maintained the issue and incident logs.
  • Handled production deployment, schedule monitoring, pre-implementation planning, resolution of post-implementation issues, and emergency production recoveries.
  • Coordinated with multiple teams across various streams and locations to integrate the project activities.
  • Wrote TWS scripts for scheduling the jobs; conducted functional, system, and performance testing and supported UAT.
  • Led the team in completing standard development documentation and processes.
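
A sketch of the hard-coding removal, under assumed names: directories and credentials arrive as dsjob run-time parameters from an environment file rather than being baked into the job. The project (DWPROJ), job, parameter, and config-file names are hypothetical.

#!/bin/bash
# Run a DataStage job with environment-specific parameters instead of
# hard-coded directories/credentials. Project, job, parameter, and
# config-file names are hypothetical.
ENV=${1:?usage: run_job.sh <dev|stage|prod>}
. "/etl/config/${ENV}.env"            # defines SRC_DIR, DB_USER, DB_PASSWD

dsjob -run -jobstatus \
      -param SourceDir="$SRC_DIR" \
      -param DBUser="$DB_USER" \
      -param DBPasswd="$DB_PASSWD" \
      DWPROJ CustomerLoad
rc=$?
echo "dsjob finished with exit status $rc"
exit $rc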

Environment: DataStage 8.5, DB2, Windows XP, UNIX, TWS, IBM Command Center, AQT, Team Foundation Server

Confidential, MN

Senior Informatica Developer

Responsibilities:

  • Participated in requirement gathering and business meetings, understood the requirements, and translated them into the technical design document.
  • Handled effort estimation and tracking, schedule adherence, and status reporting to the clients.
  • Designed, built, tested, and deployed ETL components in Informatica.
  • Developed routines and maintained the issue and incident logs.
  • Wrote shell scripts to FTP the feeds from different providers.
  • Maintained versions using PVCS Dimensions and handled deployment using WBSD.
  • Involved in data modeling and creation of the physical database tables.
  • Conducted functional, system, and performance testing and supported the UAT phase with clients.
  • Helped improve the production batch cycles and automated the data reconciliation process.
  • Designed ETL jobs for extracting, cleansing, transforming, integrating, and loading data into different data marts using Informatica tools, including error processing, reprocessing logic, count-match logic, and restartability logic.
  • Wrote system specifications by collecting business requirements and translating user requirements into technical specifications.
  • Documented mapping guidance and standards to ensure consistency and to train team members.
  • Developed Autosys jobs to schedule Informatica workflows on a periodic basis.
  • Extensively used SQL to unit-test and validate the proper functioning of the Informatica code.
  • Designed the ETL flow using Microsoft Visio and created design document standards that were used across the team.
  • Wrote cron jobs and shell scripts to schedule the Informatica workflows (see the sketch after this list).
  • Involved in performance testing.
  • Worked with QualityStage to standardize the data.
  • Created documentation for the different stages of each phase, e.g., turnover, initial design, final design, version control, implementation planning, unit testing, and performance testing.
  • Used Rational ClearCase and ClearQuest as part of the change management process.
  • Identified and documented the data sources and transformation rules required to populate and maintain data warehouse content.
  • Involved in data warehouse project tasks such as planning, requirement gathering, identifying sources, resource allocation, and execution of the projects, including planning of implementation activities.
  • Worked extensively with stages such as Sequential File, Lookup, Aggregator, Transformer, Join, Pivot, Sort, Filter, and DB2/UDB Enterprise when developing jobs.
  • Imported metadata from the repository and exported/imported projects across various platforms.
  • Developed job sequences and scheduled jobs based on dependencies using Control-M.
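
A sketch of the cron-driven scheduling, with hypothetical domain, service, folder, and workflow names: cron invokes a small wrapper that starts an Informatica workflow through pmcmd and waits for it to finish.

#!/bin/bash
# Wrapper invoked by cron; a matching crontab entry might look like:
#   30 1 * * * /etl/bin/run_wf_claims.sh >> /etl/logs/wf_claims.cron.log 2>&1
# Starts an Informatica workflow via pmcmd and waits for completion.
# Domain, service, folder, and workflow names are hypothetical.
pmcmd startworkflow -sv INT_SVC -d DOM_PROD \
      -u etl_user -p etl_password \
      -f CLAIMS -wait wf_claims_nightly
rc=$?
[ $rc -eq 0 ] || echo "wf_claims_nightly failed (rc=$rc)" >&2
exit $rc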

Environment: DataStage 7.5 PX, DB2 UDB, AIX, UNIX Shell Scripting, SQL, AQT, TWS

Confidential

ETL Developer

Responsibilities:

  • Designed ETL jobs for extracting, cleansing, transforming, integrating, and loading data into different data marts using DataStage tools, including error processing, reprocessing logic, count-match logic, and restartability logic (a count-match sketch follows this list).
  • Wrote system specifications by collecting business requirements and translating user requirements into technical specifications.
  • Performed testing to demonstrate the performance advantages of Parallel Extender jobs.
  • Created jobs using Transformer, Join, Change Capture, Remove Duplicates, Merge, and Lookup stages.
  • Developed UNIX shell scripts to automate the data load processes.
  • Created design documents for all jobs.
  • Prepared documentation for unit, integration, and final end-to-end testing.
  • Involved in performance tuning of DB2 programs for better data access.
  • Incorporated restart logic in data processing programs.
  • Documented mapping guidance and standards to ensure consistency and to train team members.
  • Identified and documented the data sources and transformation rules required to populate and maintain data warehouse content.
  • Involved in data warehouse project tasks such as planning, requirement gathering, identifying sources, resource allocation, project execution, and planning of implementation activities.
  • Worked extensively with stages such as Sequential File, Lookup, Aggregator, Transformer, Join, Pivot, Sort, Filter, and DB2/UDB Enterprise when developing jobs.
  • Performance-tuned ETL jobs as part of the simplification of the enterprise data warehouse.
  • Developed job sequences and scheduled jobs based on dependencies using Control-M.
  • Imported metadata from the repository and exported/imported projects across various platforms.
  • Designed reusable logic using Shared Containers in DataStage Designer.
  • Used DataStage Director to validate jobs before compilation and debugged components by monitoring the resulting executable runs.
  • Defined the partitioning/indexing strategy for data mart tables to enhance performance for reporting and analytics.
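
The count-match logic above can be sketched as a simple source-versus-target row-count reconciliation through the DB2 command line; the database, schema, table, and credential names are hypothetical.

#!/bin/bash
# Count-match check: compare source and target row counts after a load
# and fail on any mismatch. Database, tables, and credentials are hypothetical.
db2 connect to DWDB user etl_user using etl_password > /dev/null || exit 1

src=$(db2 -x "SELECT COUNT(*) FROM STG.ORDERS")
tgt=$(db2 -x "SELECT COUNT(*) FROM DM.ORDERS_FACT")
db2 connect reset > /dev/null

# db2 -x output may carry leading blanks, so strip whitespace before comparing.
if [ "$(echo $src)" = "$(echo $tgt)" ]; then
  echo "COUNT MATCH: $(echo $src) rows"
else
  echo "COUNT MISMATCH: source=$src target=$tgt" >&2
  exit 1
fi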

Environment: DataStage 7.5 PX, DB2 UDB, AIX, UNIX Shell Scripting, SQL, AQT, Control-M

Confidential

DataStage ETL Consultant

Responsibilities:

  • Developed jobs to extract, transform, and load the data from the source to the staging area and then to the DW tables.
  • Used DataStage Manager to export and import the DataStage jobs.
  • Coordinated with the business partners on the changes and issues.
  • Developed shared container jobs and routines.
  • Developed and executed the unit test cases.
  • Supported User Acceptance Testing (UAT) with data loading.
  • Involved in preparing mapping documents, system testing, and regression testing.
  • Involved in performance tuning; logged error messages and fixed the errors.
  • Created shell scripts to automate the data load jobs.
  • Prepared the Visio diagrams and wrote Control-M scripts for scheduling the jobs.
  • Involved in preparing checklists and documents, along with the objects, for deliverables.

Environment: DataStage 7.5, DB2 UDB, Windows XP, UNIX, TWS, PVCS Dimensions
