ETL Solution Lead Resume
SUMMARY
- Over twelve years of full life cycle experience in the analysis, design, and development of data migration and data warehousing applications.
- Translates complex business requirements into Technical Specifications for a highly scalable, large-volume enterprise product or solution.
- Drives high-level architectural planning and proof-of-concepts to support important enterprise initiatives.
- Provides inputs to Enterprise Architecture regarding the data volumes, sizing metrics and concurrent end-user activity estimates to compute a capacity plan.
- Establishes standards and best practices for application implementation, tools and technologies.
- Maintains a repository of common or frequently used templates and promotes the usage of these where applicable.
- Data warehousing experience using Informatica PowerCenter 9.x/8.x, with extensive experience designing workflows, worklets, mappings, and reusable mappings, configuring the Informatica server, and scheduling workflows and sessions.
- Extensive data modeling experience using dimensional data modeling, star schemas, snowflake schemas, and fact and dimension tables.
- Well versed in offshore-onsite projects, cross-vendor work models, and service management activities.
- Thorough knowledge of the different stages of a project and aptitude for meeting deadlines, recognizing potential bottlenecks and taking the required corrective action.
- Manages end-to-end service levels, ensuring the service meets the performance agreed in the Service Level Agreement (SLA) and Operational Level Agreement (OLA).
- Experience in four full life cycle implementations, production support, enhancement, and rollout projects.
- Experience in designing and implementing end-to-end BI solutions for organizations.
- Strong exposure to writing simple and complex SQL, PL/SQL functions, procedures, and packages, and creating Oracle objects: tables, materialized views, triggers, synonyms, user-defined data types, nested tables, and collections.
- Proficient in writing complex stored procedures, normalization, database design, and creating indexes, functions, triggers, and subqueries (a minimal PL/SQL sketch appears at the end of this summary).
- Worked with Teradata utilities such as BTEQ, FastExport, FastLoad, and MultiLoad to export and load data to/from different source systems, including flat files.
- Hands-on experience with query tools such as TOAD, SQL Developer, PL/SQL Developer, and Teradata Queryman.
- Involved in production and user support for Informatica-related issues.
- Installed, configured, administered, upgraded, and managed the Informatica 9.6.1 multi-node grid environment with High Availability.
- Experience in creating PowerCenter domains, grids, nodes, and services.
- Experience in security administration creating and managing users, groups and roles and assigning privileges.
- Involved in designing and implementation of a Metadata Repository.
- Used Informatica Metadata Manager exhaustively to maintain and document metadata.
- Reviews resource estimates for development projects, performs design reviews and assesses contingency/disaster recovery plans.
- Enforces project/solution architectural integrity with appropriate change control, design reviews, development standards and code reviews.
- Managed teams of 8-10 developers in an offshore-onsite model.
- Involved in disaster recovery exercises and the data archival process for the Informatica infrastructure.
- Designed and implemented projects using Google Cloud Analytics, BigQuery, and GCP.
- Worked as an Agile Scrum Master, executing projects using features, user stories, and related artifacts.
- Good knowledge of Error Handling and Recovery strategies in Informatica ETL environments.
- Analytical, methodical, and resourceful approach to problem solving, identifying and documenting root cause analysis and corrective actions to meet short- and long-term business and system requirements.
- Excellent verbal and written communication skills with ability to interact at all levels of management.
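For illustration, the sketch below shows the kind of PL/SQL load procedure with basic error handling described above. It is a minimal sketch only: the object names (stg_claims, dw_claims, etl_error_log) are hypothetical placeholders, not objects from any actual engagement.

```sql
-- Minimal PL/SQL sketch of a staging-to-warehouse load with basic error
-- handling. All table and column names are hypothetical placeholders.
CREATE OR REPLACE PROCEDURE load_dw_claims AS
  v_err VARCHAR2(4000);
BEGIN
  -- Set-based insert of new rows from the staging table
  INSERT INTO dw_claims (claim_id, member_id, claim_amount, load_date)
  SELECT s.claim_id, s.member_id, s.claim_amount, SYSDATE
    FROM stg_claims s
   WHERE NOT EXISTS (SELECT 1 FROM dw_claims d WHERE d.claim_id = s.claim_id);
  COMMIT;
EXCEPTION
  WHEN OTHERS THEN
    -- Record the failure so the recovery process can restart the load
    v_err := SQLERRM;
    ROLLBACK;
    INSERT INTO etl_error_log (proc_name, err_msg, err_time)
    VALUES ('LOAD_DW_CLAIMS', v_err, SYSTIMESTAMP);
    COMMIT;
    RAISE;
END load_dw_claims;
/
```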
PROFESSIONAL EXPERIENCE
ETL Solution Lead
Confidential
Responsibilities:
- Plays the role of ETL Solution Architect, responsible for designing the ETL strategy and architecture of the project.
- Drives high-level architectural planning and proof-of-concepts to support important enterprise initiatives such as GCP implementation and Google Cloud Analytics.
- Translates complex business requirements into Technical Specifications for a highly scalable, large-volume enterprise product or solution.
- Communicates with business users and source data owners to gather reporting requirements and to assess and discover source data content, quality, and availability.
- Designs automated processes to capture data quality and reasonability parameters of loaded data and to compare them against baseline versions of the data (a sample check, written in BigQuery SQL, is sketched after this list).
- Well versed in both Agile (Scrum and Kanban) and Waterfall methodologies.
- Uses JIRA and Rally as Agile tools.
- Technical Point of Contact for Salesforce OLS project.
- Designed and implemented projects using Google Cloud Analytics, BigQuery, and GCP.
- Worked as an Agile Scrum Master, executing projects using features, user stories, and related artifacts.
- Provides inputs to Enterprise Architecture regarding the data volumes, sizing metrics and concurrent end-user activity estimates to compute a capacity plan.
- Used Informatica Metadata Manager exhaustively to maintain and document metadata.
- Assessed and communicated the impact of planned maintenance activities.
- Created the test environment for the staging area and loaded it with data from multiple sources.
- Provided customer training on web portal tools, fault management and escalation procedures, and product offerings.
- Captures metadata in the Metadata Tool Repository after the project has been delivered.
- Used the metadata repository for impact analysis, data lineage, and documenting the business glossary.
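As referenced above, a reasonability check of this kind can be expressed directly in BigQuery Standard SQL. The sketch below compares each day's loaded row count against a stored baseline and flags deviations over 10%; the dataset and table names (analytics.sales_fact, analytics.load_baseline) and the threshold are hypothetical assumptions, not details from the project.

```sql
-- Hypothetical BigQuery reasonability check: flag any load date whose
-- row count deviates from the stored baseline by more than 10%.
SELECT
  cur.load_date,
  cur.row_cnt,
  base.expected_cnt,
  SAFE_DIVIDE(ABS(cur.row_cnt - base.expected_cnt), base.expected_cnt) AS pct_diff
FROM (
  SELECT DATE(load_ts) AS load_date, COUNT(*) AS row_cnt
  FROM analytics.sales_fact
  GROUP BY load_date
) AS cur
JOIN analytics.load_baseline AS base
  ON cur.load_date = base.load_date
WHERE SAFE_DIVIDE(ABS(cur.row_cnt - base.expected_cnt), base.expected_cnt) > 0.10;
```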
ETL Technical Consultant
Confidential, Plymouth, Minnesota
Responsibilities:
- Played the role of ETL Technical Consultant, responsible for ETL design and data architecture.
- Worked on projects including pharmacy benefit management, finding fraud errors in processed claims, sunsetting databases no longer in use, building data marts, and enhancing existing data warehouse applications.
- Specialized in performance tuning of projects before deployment to production.
- Governed technical design and code review meetings to promote industry standards and best practices, and built an enterprise-level knowledge base repository for reusability.
- Translated complex business requirements into Source-to-Target Mappings (S2T) for a highly scalable, large-volume enterprise product or solution.
- Used data profiling techniques to understand data quality (a sample profiling query is sketched after this list).
- Communicated with business users and source data owners to gather reporting requirements and to assess and discover source data content, quality, and availability.
- Designed automated processes to capture data quality and reasonability parameters of loaded data and to compare them against baseline versions of the data.
- Provided inputs to Enterprise Architecture regarding data volumes, sizing metrics, and concurrent end-user activity estimates to compute a capacity plan.
- Wrote PL/SQL stored procedures and triggers to move relevant information from various tables into intermediate tables.
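The profiling mentioned above typically starts with simple summary queries. The sketch below (Oracle SQL) computes row counts, distinct counts, null rates, and value ranges for a few key columns; the table and column names (src_claims, claim_id, member_id, service_date) are illustrative placeholders.

```sql
-- Hypothetical data-profiling query: basic quality metrics for a source table.
SELECT COUNT(*)                           AS total_rows,
       COUNT(DISTINCT claim_id)           AS distinct_claim_ids,
       SUM(CASE WHEN member_id IS NULL THEN 1 ELSE 0 END) AS null_member_ids,
       ROUND(100 * SUM(CASE WHEN member_id IS NULL THEN 1 ELSE 0 END)
                 / NULLIF(COUNT(*), 0), 2) AS pct_null_member_id,
       MIN(service_date)                  AS min_service_date,
       MAX(service_date)                  AS max_service_date
  FROM src_claims;
```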
Technical Environment: Informatica PowerCenter/Exchange 10, Oracle, SQL Server, JSON/XML format files, Flat Files, Salesforce, UNIX, Toad, Automic Scheduler.
ETL Technical Lead/ Architect
Confidential
Responsibilities:
- Played the role of Technical Architect/Lead, responsible for designing the ETL strategy and architecture of the project.
- Drives high-level architectural planning and proof-of-concepts to support important enterprise initiatives.
- Translates complex business requirements into Technical Specifications for a highly scalable, large-volume enterprise product or solution.
- Communicated with business users and source data owners to gather reporting requirements and to assess and discover source data content, quality, and availability.
- Designed automated processes to capture data quality and reasonability parameters of loaded data and to compare them against baseline versions of the data.
- Provides inputs to Enterprise Architecture regarding the data volumes, sizing metrics and concurrent end-user activity estimates to compute a capacity plan.
- Used Informatica Metadata Manager exhaustively to maintain and document metadata.
- Assessed and communicated the impact of planned maintenance activities.
- Created the test environment for the staging area and loaded it with data from multiple sources.
- Provided customer training on web portal tools, fault management and escalation procedures, and product offerings.
- Captures metadata in the Metadata Tool Repository after the project has been delivered.
- Used the metadata repository for impact analysis, data lineage, and documenting the business glossary.
- Involved in an end-to-end POC, from installing the product to engaging the business unit to gather requirements on product information.
- Designed the ETL flow for extracting data from legacy systems.
- Installed and configured Informatica grid and high availability.
- Created mappings in Informatica to process the source files and applied business logic rules to the data before loading it into the warehouse (an equivalent set-based SQL load is sketched after this list).
- Expertise in data modeling and the design of logical and physical data models.
- Maintained effective communication with non-technical client personnel and handled change requests.
- Enforces project/solution architectural integrity with appropriate change control, design reviews, development standards and code reviews.
- Created Unix Shell scripts to automate the data load processes to the target Data Warehouse.
- Coordinated tasks with onsite and offsite team members.
- Provided 24x7 production support and trained end users on the application.
- Establishes standards and best practices for application implementation, tools and technologies.
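As noted in the mapping bullet above, the business logic such a mapping applies can be expressed as an equivalent set-based SQL load. The Oracle MERGE below is a sketch under assumed names (stg_product, dim_product); it is not the Informatica mapping itself, only the set-based equivalent of its insert-or-update logic.

```sql
-- Set-based sketch of an insert-or-update warehouse load; all object
-- names are hypothetical placeholders.
MERGE INTO dim_product d
USING (SELECT product_id, product_name, list_price FROM stg_product) s
   ON (d.product_id = s.product_id)
 WHEN MATCHED THEN UPDATE
      SET d.product_name = s.product_name,
          d.list_price   = s.list_price,
          d.updated_at   = SYSDATE
 WHEN NOT MATCHED THEN INSERT (product_id, product_name, list_price, created_at)
      VALUES (s.product_id, s.product_name, s.list_price, SYSDATE);
```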
Technical Environment: Informatica PowerCenter/Exchange 9.6.1, DB2 on Mainframe, VSAM, Shell Scripts, IBM AIX, Oracle, Amazon Redshift, REST/SOAP, JSON/XML format files, Salesforce, MicroStrategy.