ETL Lead/Architect Resume
Phoenix
SUMMARY
- Around 13 years of software experience in client/server technologies and Internet applications, of which:
- 7.4 years in the design, development and implementation of Business Intelligence projects.
- 6.0 years in a Project Leader role on Business Intelligence projects.
- Extensive experience working with the North American banking industry. Strong knowledge of financial instruments, with regulatory reporting experience and analytical skills.
- Extensive experience in Business Intelligence projects, with working knowledge of Extract, Transform and Load (ETL) processes for loading data into the data warehouse.
- Conversant with all phases of Software Development Life Cycle (SDLC) involving Systems Analysis, Design, Development, and Implementation. Experience in managing and working with Onsite - Offshore teams.
- Expertise in risk data management for submitting CCAR schedules to the Federal Reserve and in regulatory reporting.
- Extensive experience in extracting, transforming and loading data from various sources into data warehouses using Informatica PowerCenter and PowerExchange 7.1.3, 8.6 and 9.5.1.
- Experience with Master Data Management (MDM) and the Informatica MDM toolset: MDM Hub Console, Hierarchy Manager (HM) and Informatica Data Director (IDD).
- Extensively worked on data management projects involving data profiling, staging, cleansing and migration.
- Strong Data analysis background with experience in gathering, analyzing and documenting the requirements.
- Identified Key Data Elements (KDEs) to define business rules for data quality, working with Lines of Business and data stewards.
- Good understanding of highly scalable parallel-processing infrastructure using DataStage Enterprise Edition, and of DataStage Manager for importing metadata and creating new data elements.
- Implemented ETL strategies that optimize mapping performance.
- Experience supporting existing data warehousing systems and resolving production issues.
- Experience in UNIX shell scripting, CRON, FTP and file management in various UNIX environments
- Created ETL mappings using Informatica Power Center to move Data from multiple sources like XML, SQL server, Flat files, Oracle into a common target area such as Staging, Data Warehouse and Data Marts.
- Designed, developed and implemented a Master Data Management solution for the Customer/Party MDM domain using MDM 10.1.
- Experience with Teradata SQL and utilities: BTEQ, FastLoad, FastExport, MultiLoad and TPump.
- Thorough understanding of Business Intelligence and Data Warehousing Concepts with emphasis on ETL
- Well versed with all phases of the Software Development Life Cycle (SDLC) with respect to development, deployment, maintenance and enhancements, especially Agile and Scrum.
- Strong experience in database and ETL/ELT Admin tasks.
- Excellent communication and documentation skills.
- Ability to multi-task and meet deadlines in a fast-paced, dynamic environment.
- Ability to train the end users to get adapted to the newly developed system and possess excellent skills to build client interface and rapport with Client Base.
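As an illustration of the UNIX shell scripting and file management mentioned above, a nightly landing-zone step might look like the following. This is a minimal sketch only; the file names, paths, and one-line control-file layout are assumptions, not taken from any specific project.

```shell
#!/bin/sh
# Illustrative sketch of a nightly file-management step (hypothetical names/paths).
# A control file is assumed to hold the expected record count for the data file.

stage_file() {
    datafile=$1       # incoming data file
    ctlfile=$2        # control file containing the expected row count
    archive=$3        # archive directory

    expected=$(cat "$ctlfile")
    actual=$(wc -l < "$datafile")

    # Reject the feed if the row count does not match the control file
    if [ "$actual" -ne "$expected" ]; then
        echo "REJECT: $datafile has $actual rows, expected $expected"
        return 1
    fi

    # Archive the file with a date suffix so reruns do not collide
    mkdir -p "$archive"
    mv "$datafile" "$archive/$(basename "$datafile").$(date +%Y%m%d)"
    echo "STAGED: $(basename "$datafile")"
    return 0
}
```

In practice a wrapper like this would be driven by CRON after an FTP pull, with the REJECT/STAGED messages going to a run log.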
TECHNICAL SKILLS
ETL Tools: Informatica PowerCenter 7.1, 8.1.6, 9.1; DataStage; SSIS; SSRS
Languages: C++, C, PL/SQL, SQL
Web Technologies: HTML/XHTML
Tools: SSRS 2008, SSIS, MS-Project, Crystal Reports, MS Visio, Excel.
Operating Systems: Windows 2000/NT/XP/2003/Vista/7/2008, Linux
Methodologies: Agile, Scrum, Waterfall
Third Party Tools: Peregrine (HP Service Manager 9.2/7.0)
Databases: Oracle 8.x/9.x/10g/11g, Teradata, Greenplum; Editors: SQL Navigator, Toad
Data modeling Tools: ERWIN, VISIO
Job Scheduling Tools: Autosys, Control M, Tivoli
BI Tools: BusinessObjects 6.5, Cognos ReportNet, Dashboard Manager, OBIEE
PROFESSIONAL EXPERIENCE
Confidential, Phoenix
ETL Lead/Architect
Responsibilities:
- Working with a leading high-tech manufacturing company in North America as a lead member for the enterprise PIM initiative and multi-domain global MDM implementation.
- Involved in setting up a Product MDM infrastructure for the client using the Informatica PIM tool. This is part of a larger program that helps the organization create a unified online portal, enabled by IBM WebSphere Commerce, across geographies.
- Owned the end-to-end solution comprising ETL and database components.
- Worked with other teams and the business on analysis and understanding of the target and source systems, and provided technical solutions and technical artifacts.
- Technical artifacts included the data warehouse architecture, data model, ETL mapping document, and the ETL and reports development road map. Developed and executed functional/system test cases to ensure the product met functional and design specifications.
- Developed Informatica ETL mappings to transform data from various sources per the defined business logic.
- Involved in testing of the system and coordinated the work with the offshore team.
- Involved in the automation of the complete process.
- Involved in Work Breakdown Structure (WBS) creation, Estimation, Risk Identification and Mitigation.
- Involved in Change Management and Delivery Management Activities.
Confidential, Houston, TX
ETL Lead and Data Analyst
Responsibilities:
- Performed data profiling on the source systems required for the FTL/RITE and QTRAC systems.
- Documented the complete process flow, describing program development, logic, testing, implementation, application integration and coding.
- Worked with internal architects assisting in the development of current and target state enterprise data architectures.
- Worked with project team representatives to ensure that logical and physical data models (ER/Studio) were developed in line with corporate standards and guidelines.
- Involved in defining the source to target data mappings, business rules and data definitions.
- Responsible for defining the key identifiers for each mapping/interface.
- Responsible for defining the functional requirement documents for each source to target interface.
- Documented, clarified and communicated change requests with the requestor, and coordinated with the development and testing teams.
- Involved in implementing the Land Process of loading the customer/product Data Set into Informatica MDM from various source systems
- Worked on data cleansing and standardization using the cleanse functions in Informatica MDM.
- Reverse engineered all the source databases using Erwin.
- Documented data quality and traceability documents for each source interface.
- Designed and implemented data integration modules for Extract/Transform/Load (ETL) functions.
- Involved in data warehouse design.
- Experienced with various ETL and data warehousing tools and concepts.
- Used data analysis techniques to validate business rules and identify low quality missing data in the existing Amgen enterprise data warehouse (EDW).
Confidential, Los Angeles, CA
ETL Lead
Responsibilities:
- Involved in migration of data from Oracle to Greenplum (a PostgreSQL-based database).
- Work with customers in gathering business requirements for data migration needs.
- Work across multiple functional projects to understand data usage and implications for data migration.
- Assist in designing, planning and managing the data migration process.
- Work with subject matter experts and project team to identify, define, collate, document and communicate the data migration requirements.
- Responsible for data acquisition, reconciliation, audit control & release of CCAR FR Y-14 Retail, Wholesale & Securities schedules for certifications and FED submissions.
- Responsible for implementing Credit Risk Management and U.S. Basel III regulatory requirements.
- Understand and document detailed requirements, data validation requirements and translate business needs into process improvement solutions. Develop functional requirements, prepare the HLD & LLD and carry out impact analysis. Ensure functional specifications alignment with business requirements.
- Perform data analysis by accessing the data using queries & reporting tools, including mapping the business functions to the data, data model, data reconciliations and gap analysis.
- Designed the edit-check engine, implemented the FRB and custom edit-check rules, and automated them using the Autosys scheduler. Maintained metadata and the Micro Data Reference Manual.
- Led the development team and produced the ETL design specifications; created mappings, sessions and workflows. Tested ETL with Informatica PowerCenter and database objects for quality assurance, and migrated releases into testing and production environments.
- Conducted code review sessions. Involved in unit testing and integration testing. Developed test plans, cases and scripts, and coordinated with QA and UAT teams. Actively involved in Informatica PowerCenter production support; implemented fixes/solutions to issues/tickets raised by the user community.
- Develop best practices, processes, and standards for effectively carrying out data migration activities. Work closely with business users, IT development teams, release management, DBA and business architecture review boards throughout the various phases of the project.
- Develop the project plan, resource plan and tasks with understanding the dependencies and critical path items. Assist with management on project plan, milestones and status reports.
- Responsible for task assignments to the on-site and offshore teams. Conduct scrum meetings, weekly and daily status tracking of the project. Prepare minutes of meetings, documenting key decisions reached and follow-up action items. Manage issues / risks on projects with timely issue escalations and suggestions for issues resolution.
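The edit-check automation described above can be illustrated with a single rule. This is only a sketch under assumed conventions (a pipe-delimited extract whose third field is a balance that must be non-negative); it is not an actual FRB edit check, which are far more extensive.

```shell
#!/bin/sh
# Sketch of one edit-check rule (hypothetical file layout, not a real FRB rule).
# Rows whose third field (balance) is negative go to a reject file;
# the count of passing rows is printed for the audit/reconciliation log.

run_edit_check() {
    infile=$1
    rejfile=$2
    awk -F'|' -v rej="$rejfile" '
        $3 < 0  { print > rej; next }   # failing rows: write to reject file
                { ok++ }                # passing rows: count them
        END     { print ok + 0 }        # emit pass count (0 if none)
    ' "$infile"
}
```

In a scheduled (e.g. Autosys) job, the reject file would feed the reconciliation report and the pass count would be checked against the source extract's control totals.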
Confidential, Los Angeles, CA
Data Analyst
Responsibilities:
- Work across multiple functional projects to understand data usage and implications for data migration.
- Gathered requirements, analyzed and wrote the design documents.
- Provided end-user training and documentation for customer reporting services.
- Participated in the Analysis, Design and Development Phases of report development, performance tuning and production rollout for every report of Information Technology Department.
- Queried the databases, wrote test validation scripts and performed the System testing.
- Worked with the developers during coding and while doing the remediation of the software.
- Worked with the users to do the User Acceptance Testing (UAT).
- Documented all data mapping and transformation processes in the functional design documents based on the business requirements.
- Created Technical specifications documents based on the functional design document for the ETL coding to build the data mart.
- Created data trace map and data quality mapping documents.
- Performed Data Profiling and Data Quality.
- Extensively involved in data extraction, transformation and loading (the ETL process) from source to target systems using Informatica PowerCenter 9.5.
- Designed and implemented report layouts according to reporting standards of the department.
- Owned the assigned reports, worked on them and updated the Report Development Scheduler for status on each report.
- Worked on a daily basis with lead data warehouse developers to evaluate the impact on the current implementation and the redesign of all ETL logic.
- Have taken initiative to drive the report development process.
- Worked on exporting reports in multiple formats including MS Word, Excel, CSV and PDF.
- Responsible for making the report available for scheduling or viewing on demand.
Confidential, Los Angeles, CA
Lead Developer and Data Analyst
Responsibilities:
- Understood the business requirement specifications provided by the client and translated them into ETL design specifications; created mappings, sessions and workflows.
- Involved in Data Extraction from Oracle, Flat files, Mainframe files using Informatica.
- Developed various mappings to load data from various sources (mainframe files, flat files, Oracle) using transformations such as Source Qualifier, Normalizer, Joiner, Aggregator, Lookup, Router, Update Strategy, Filter and Expression to store the data in different target tables.
- As a lead member of the ETL team, responsible for analyzing, designing and developing ETL strategies and processes, writing ETL specifications for developers, ETL and Informatica development, testing and mentoring.
- Involved in debugging the UNIX scripts according to the requirement.
- Developed mappings between source and target tables.
- Conducting code review sessions.
- Involved in Unit testing and integration testing.
- Preparing the LLD and carrying out impact analysis.
- Communicated regularly with the onsite team to gather the requirements.
Confidential, Monterey Park, CA
Lead Developer
Responsibilities:
- Modified properties and names of ETL objects and UNIX scripts per the new standardization documents.
- Modified the server names and path names in the UNIX script as per Linux standards.
- Removed the shortcut definitions in ETL objects and modified the ETL objects to work without shortcut definitions.
- Involved in system integration testing and data validation.
Confidential, NY
Senior Developer
Responsibilities:
- Worked on the source extracts as required for business on the source system (Omniture)
- Worked on the shell scripts to extract the data from the source system and load them accordingly.
- Worked extensively on the analysis of the data and transforming according to the business logic.
- Involved in developing mappings according to the business requirements received from the Strategic Business analysis team.
- Worked extensively on Teradata to analyze the data.
- As a lead member of the ETL team, responsible for analyzing, designing and developing ETL strategies and processes, writing ETL specifications for developers, ETL and Informatica development, testing and mentoring.
- Participated in technical walkthroughs/code reviews of other team members' components, test plans and results, and helped them address gaps.
- Worked on formatting SSRS reports using the Global variables and expressions.
- Created parameterized reports, Drill down and Drill through reports using SSRS.
- Deployed and uploaded the SSRS reports to SharePoint Server for the end users and involved in enhancements and modifications.
- Used ETL (SSIS) to develop jobs for extracting, cleaning, transforming and loading data into data warehouse.
- Involved in the Migration of Databases from SQL Server 2005 to SQL Server 2008.
- Prepared the complete data mapping for all the migrated jobs using SSIS.
- Created databases and schema objects including tables and indexes, applied constraints, connected various applications to the database, and wrote functions and stored procedures.
Confidential, Oaks, PA
Senior Developer
Responsibilities:
- Requirement gathering from business owners or users.
- Understand and analyze the data sources, type of data and prepare a list of data elements.
- Generation of Functional Specifications and Technical documents.
- Design for implementation of the database.
- Worked on the redesign of the TLM project, individually converting business specifications to technical specifications.
- Worked on the compatibility of software for project setup.
Confidential
Jr. Developer
Responsibilities:
- Worked extensively in production support to resolve any issues in production.
- Monitored the workflows running in production on a daily basis.
- Resolved performance issues that were raised in production.
- Assisted in Migrating Repository objects using Repository Manager between various environments.
- Exported universes to the repository to make resources available to the users.
- Monitored the Broadcast Scheduler daily and performed actions against reported issues.
- Scheduled reports on BCA, made changes to universes through the BO Designer tool and promoted them to LIVE.
- Involved in making changes to reports through the BO reporting tool and promoting them to LIVE.
Confidential
Jr. Developer
Responsibilities:
- Involved in developing the ETL (Extract, Transform and Load) strategy to populate the data warehouse from various source systems (Oracle) and flat-file feeds using Informatica.
- Worked on Informatica - Source Analyzer, Warehouse Designer, Mapping Designer, Mapplets and Transformations.
- Used various transformations like Source Qualifier, Joiner, Aggregator, Update Strategy, Expression, Sequence Generator and Filter for better data massaging and to migrate clean, consistent data.
- Created Informatica mappings to build business rules to load the data into data warehouse.
- Created Mapplets and Reusable Transformations.
- Involved in the unit testing to check the data consistency.
- Used the Informatica Workflow Manager to create sessions and workflows that run with the logic embedded in the mappings.
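Workflows like these are typically started from a scheduler via Informatica's pmcmd command line. The sketch below only builds the command string, which keeps it testable without a running Integration Service; the service, domain, folder and workflow names are hypothetical, and credentials come from environment variables via pmcmd's -uv/-pv options.

```shell
#!/bin/sh
# Builds a pmcmd invocation for starting a workflow (illustrative names only).
# -uv/-pv tell pmcmd to read the user name and password from the named
# environment variables; -wait blocks until the workflow completes.

build_pmcmd_cmd() {
    svc=$1; dom=$2; folder=$3; wf=$4
    echo "pmcmd startworkflow -sv $svc -d $dom -uv INFA_USER -pv INFA_PWD -f $folder -wait $wf"
}
```

Under Autosys or CRON, the scheduled job would eval the string returned by build_pmcmd_cmd (or call pmcmd directly) and branch on its exit code.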