
Sr. ETL-BI Consultant Resume Profile


Atlanta, GA

Professional Summary

  • 12 years of IT Experience in Data Integration, Data Architecture, Solution Design, Development and Implementation.
  • 11 years of Enterprise Data Warehouse (EDW) Solution Design and Development using Data Integration tools.
  • 11 years of programming experience in ORACLE SQL/PL-SQL, INFORMATICA and UNIX.
  • 7 years of development experience in BI tools such as OBIEE and SAP BW, and CRM tools such as Siebel.
  • 7 years of Solution Architecture and Design of EBI (Enterprise Business Intelligence) Applications.
  • 2 years of development experience in Big Data Analytics using Hadoop, Hive and Splunk for Log Analytics.

Skills Summary

  • Sr. Data Integration Architect for Enterprise Business Intelligence specialized in Data Architecture and Design.
  • Leader of Project/Delivery Teams across multiple geographies implementing Managed Service (MS) project models.
  • Expert in Informatica PowerCenter, PowerExchange CDC, IDQ and Server Management/Admin Console.
  • Expert in Data Modeling, Data Integration, Data Migration, Data Consolidation and Data Cleansing.
  • Extensive experience in ETL solutions for VLDBs and involved in several full life cycle (SDLC) implementations.
  • Experience in developing, validating, publishing and maintaining logical/physical data models and managing metadata for data models. Experience in relational and dimensional Star/Snowflake/Hybrid modeling.
  • Experience with developing Data Quality Solutions for large data warehouses.
  • Experience in Leveraging and enforcing standards and best practices to ensure consistency and reusability of data models.
  • Experience in performing reverse engineering of physical data models from databases and SQL scripts.
  • Capable of reviewing and critiquing project teams' work. Experience in AGILE development environments.
  • Experience with large multi-terabyte (>15 TB) implementations of both structured and unstructured data.
  • Experience with large-scale distributed implementations of data warehousing landscapes using database platforms IBM DB2, EMC Greenplum, Oracle Exadata and SAS.
  • Skilled in creating Design Documents, Coding Standards, Best Practices and Error and Exception Handling Strategies.
  • Excellent experience in optimizing mappings and implementing complex business rules using re-usable objects.
  • Managed Deployment scripts, initial loads and build scripts for database and Informatica code deployment.
  • Strong Oracle PL/SQL development and implementation skills covering tables, indexes, synonyms, views, materialized views, partitioning, partition exchanges, indexing and advanced database application maintenance (a partition-exchange sketch follows this list).
  • Excellent communication and problem-solving skills; consistently demonstrates out-of-the-box thinking.
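
For illustration, a minimal sketch of the partition-exchange load pattern referenced in the PL/SQL skills above; the table and partition names are hypothetical, not from any specific engagement:

    -- Swap a fully loaded staging table into the current partition of a
    -- large fact table with minimal downtime (hypothetical object names).
    ALTER TABLE sales_fact
      EXCHANGE PARTITION p_2013_06
      WITH TABLE stg_sales_fact_2013_06
      INCLUDING INDEXES
      WITHOUT VALIDATION;

    -- Rebuild any local index partitions left unusable by the exchange.
    ALTER TABLE sales_fact
      MODIFY PARTITION p_2013_06
      REBUILD UNUSABLE LOCAL INDEXES;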

Technical Skills

Database: Oracle Exadata/10g/9i/8i/8.0/7.0, IBM DB2

ETL/DWH: Informatica PowerCenter 8.6/8.1.1/7.1/6.2, Change Data Capture, Data Quality, Dimensional Modeling (Star Schema, Snowflake Schema, Facts and Dimensions)

Programming/Scripting: Oracle SQL/PL-SQL, UNIX Shell and Perl Scripting, Hadoop, Hive, Splunk

Business Products: Morningstar/Lipper Funds, Fixed Income Products, SCI - Stocks/Shares/Special Assets

Functional Domains: Investment Banking, Wealth Management, Retail Banking, Marketing and Research, Telecommunications, and Oil and Gas Production

Applications/Utilities: Autosys, MQC, Tumbleweed, DBArtisan, RapidSQL, WinSCP, Avaloq, Axiom, TLM, SQL*Loader, ClearCase, Subversion, WCC

Professional Experience

Sr. Data Integration Architect / Sr. ETL-BI Consultant

Confidential

Confidential is the third-largest communications company and one of the largest cable television providers in the United States. It is a privately owned subsidiary of Cox Enterprises providing digital cable television, telecommunications and wireless services in the United States. It serves more than 6.2 million customers, including 2.9 million digital cable subscribers, 3.5 million Internet subscribers, and almost 3.2 million digital telephone subscribers, making it the seventh-largest telephone carrier in the country.

Responsibilities:

  • Provide solution architecture for the EBI systems, deliver high-level designs and work with EAs (Enterprise Architects).
  • Act as the Data Architect and provide end-to-end technical guidance for smooth flow of the data from ODS to EDW.
  • Provide technical guidance to project teams in the development of project prototypes/project scoping efforts.
  • Handle projects in the Oracle Exadata, Informatica, OBIEE and SAS environments.
  • Ensure the integration of metadata across the tool suite. Ensure the usage and development of the metadata repository.
  • Assisting in defining data quality metrics, standards and guidelines for using data
  • Drafting and communicating data integration strategies and visions
  • Assuring that sensitive data, regardless of format, is protected at all times by only using approved equipment, networks and other controls
  • Championing the integration of data governance within the standard project methodology
  • Ensuring that standard project methodology is followed and that policies, procedures and metrics are in place for maintaining/improving data quality and the creation, capture and maintenance of metadata
  • Ensuring that all strategic data is modeled, named and defined consistently
  • Ensuring that projects source and utilize data as much as is feasible from the designated system of record
  • Supporting/sharing knowledge with other architects
  • Communicating new and changed business requirements, issues and problems to individuals who may be impacted
  • Attend design meetings and code review meetings and provide feedback. Serve as tech lead for several outsourced projects.
  • Work with Unica campaign management team to promote new products and services to existing and new customers.
  • Work with Third Party Marketing vendors for email marketing of new promotions to get new customers.
  • Communicating concerns, issues and problems with data to the individuals who can influence change
  • Data analysis and ETL design on assigned projects. Document Requirements, design, and develop Informatica code.
  • Design, Develop and modify Informatica workflows and mappings.
  • Aid in gathering requirements, conducting business analysis, and writing technical design specifications.
  • Evaluate and design logical and physical databases; define logical views and physical data structures (a minimal star-schema sketch follows this list).
  • Validate test plans and test scenarios and verify test results. Help QA teams understand business/technical requirements.
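
As an illustration of the logical views and physical structures described above, a minimal star-schema sketch; all object names are hypothetical and do not reflect the client's actual model:

    -- Hypothetical dimension and partitioned fact for a subscriber data mart.
    CREATE TABLE dim_subscriber (
      subscriber_key   NUMBER        PRIMARY KEY,
      subscriber_id    VARCHAR2(20)  NOT NULL,
      service_type     VARCHAR2(30),
      effective_date   DATE,
      current_flag     CHAR(1)
    );

    CREATE TABLE fact_daily_usage (
      subscriber_key   NUMBER REFERENCES dim_subscriber (subscriber_key),
      usage_date       DATE,
      usage_minutes    NUMBER,
      data_volume_mb   NUMBER
    )
    PARTITION BY RANGE (usage_date)
    ( PARTITION p_2013_q1 VALUES LESS THAN (DATE '2013-04-01'),
      PARTITION p_2013_q2 VALUES LESS THAN (DATE '2013-07-01') );

    -- Logical view layer exposed to downstream reporting tools.
    CREATE OR REPLACE VIEW v_daily_usage AS
      SELECT d.subscriber_id, d.service_type, f.usage_date,
             f.usage_minutes, f.data_volume_mb
      FROM   fact_daily_usage f
      JOIN   dim_subscriber d ON d.subscriber_key = f.subscriber_key;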

Environment: Informatica 9.1, Oracle Exadata, OBIEE Reporting, SAS and Splunk Analytics.

Confidential

Data Architect / ETL Specialist / Sr. ETL-BI Consultant

Confidential is a global information technology research and advisory company providing technology-related insight. Products provided by Gartner are targeted at CIOs and senior IT leaders in various industries, including government agencies, high-tech and telecom enterprises, professional services firms, and technology investors. The company consists of Research, Executive Programs, Consulting and Events. Gartner conducts webinars, IT meets, symposiums and research on various products on the market. In order to reach out to various contacts and prospects depending upon their interests, a project was designed to build a Confidential. This marketing intelligence DW acts as a consolidated CRM platform used by various Marketing and Research teams. As a warehouse it consolidates all the information received from various external/internal systems, cleanses the data and loads it into the warehouse based on business needs. This BAW (Business Analytics Warehouse) allows consolidation and segmentation of marketing data for launching campaigns and symposiums for various clients from several business functions.

Responsibilities:

  • Work closely with Business Users and Business Analysts to gather functional and technical requirements.
  • Design and Implement ETL solutions for complex functional and technical requirements using Informatica PowerCenter 8.6.
  • Create custom solutions using Informatica mappings/mapplets/re-usable transformations/shared objects/shortcuts/sessions/workflows/worklets etc.
  • Engage Data Architecture team for Data Modeling and DBA for database object creations.
  • Participate in the logical data model design and work closely with the architecture team on physical implementations.
  • Closely work on the relational, dimensional modeling aspects of the DW design and participate in brainstorm sessions.
  • Implement error handling, exception handling and logging mechanisms and follow standards and best practices in coding (an error-logging sketch follows this list).
  • Implement Data Archiving, Data Reconciliation using Informatica/SQL/Unix components.
  • Create shell scripts for various requirements and define new processes for the application.
  • Create web service calls, HTTP transformation utilities and XML source/target definitions through Informatica.
  • Work on database Triggers, Stored Procedures, Functions and Constraints.
  • Write complex stored procedures and triggers and optimize them for maximum performance.
  • Utilize Oracle utilities such as Import/Export and SQL*Loader for large-scale data loads.
  • Prepare weekly/monthly application performance charts and chair the review calls.
  • Perform unit testing for the developed mappings and prepare unit test specification requirements.
  • Handle the test execution sequence, making sure that deliverables are on track.
  • Create UNIX scripts for ETL processes such as file validation, remote upload/download, FTP/SFTP and basic file manipulation.
  • Review unit testing results and coordinate UAT testing.
  • Provide debug, analysis, and verification support for component/Integration/E2E and UAT test cycles.
  • Assign code review and participate in peer review of several ETL components.
  • Develop data migration strategies and processes for production deployment.
  • Take care of production and release issues.
  • Support the production support team and Operations team in their day-to-day needs.
  • Problem and incident management: respond to production incidents, tracking incidents to resolution and closure.
  • Coordinate cross-team requirements between upstream/downstream teams, application architects and enterprise teams.
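
A minimal sketch of the error-logging mechanism mentioned above, assuming an Oracle autonomous-transaction logging procedure; the table, sequence and procedure names are hypothetical:

    -- Hypothetical error log used by ETL wrapper procedures.
    CREATE SEQUENCE etl_error_log_seq;

    CREATE TABLE etl_error_log (
      log_id      NUMBER         PRIMARY KEY,
      proc_name   VARCHAR2(100),
      error_code  NUMBER,
      error_msg   VARCHAR2(4000),
      logged_at   TIMESTAMP DEFAULT SYSTIMESTAMP
    );

    CREATE OR REPLACE PROCEDURE log_etl_error (
      p_proc_name  IN VARCHAR2,
      p_error_code IN NUMBER,
      p_error_msg  IN VARCHAR2
    ) AS
      PRAGMA AUTONOMOUS_TRANSACTION;  -- the log row survives a rollback of the failed load
    BEGIN
      INSERT INTO etl_error_log (log_id, proc_name, error_code, error_msg)
      VALUES (etl_error_log_seq.NEXTVAL, p_proc_name, p_error_code, p_error_msg);
      COMMIT;
    END log_etl_error;
    /

    -- Typical usage inside a load procedure's exception handler:
    -- EXCEPTION
    --   WHEN OTHERS THEN
    --     log_etl_error('LOAD_CONTACTS', SQLCODE, SQLERRM);
    --     RAISE;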

Environment: Informatica 8.6, Oracle 11g, DAC, Autosys, Siebel

Confidential

Data Architect / ETL Specialist / Sr. ETL-BI Consultant

Confidential is a multinational corporation based in Greater Sharpstown, Houston, Texas, and a leading worldwide provider of equipment and components used in oil and gas drilling and production operations, oilfield services, and supply chain integration services to the upstream oil and gas industry. The Company conducts operations in over 1,160 locations across six continents. The Company's common stock is traded on the New York Stock Exchange under the symbol NOV. The Company operates through three reporting segments: Rig Technology, Petroleum Services & Supplies, and Distribution & Transmission.

Responsibilities:

  • Work closely with Business Users and Business Analysts to gather functional and technical requirements.
  • Design and Implement ETL solutions for complex functional and technical requirements using Informatica PowerCenter 8.6.
  • Create custom solutions using Informatica mappings/mapplets/re-usable transformations/shared objects/shortcuts/sessions/workflows/worklets etc.
  • Engage Data Architecture team for Data Modeling and DBA for database object creations.
  • Participate in the logical data model design and work closely with the architecture team on physical implementations.
  • Closely work on the relational, dimensional modeling aspects of the DW design and participate in brainstorm sessions.
  • Implement error handling, exception handling and logging mechanisms and follow standards and best practices in coding.
  • Implement data archiving and data reconciliation using Informatica/SQL/UNIX components (a reconciliation query sketch follows this list).
  • Create shell scripts for various requirements and define new processes for the application.
  • Create web service calls, HTTP transformation utilities and XML source/target definitions through Informatica.
  • Work on database Triggers, Stored Procedures, Functions and Constraints.
  • Write complex stored procedures and triggers and optimize them for maximum performance.
  • Utilize Oracle utilities such as Import/Export and SQL*Loader for large-scale data loads.
  • Prepare weekly/monthly application performance charts and chair the review calls.
  • Perform unit testing for the developed mappings and prepare unit test specification requirements.
  • Handle the test execution sequence, making sure that deliverables are on track.
  • Create UNIX scripts for ETL processes such as file validation, remote upload/download, FTP/SFTP and basic file manipulation.
  • Review unit testing results and coordinate UAT testing.
  • Provide debug, analysis, and verification support for component/Integration/E2E and UAT test cycles.
  • Assign code review and participate in peer review of several ETL components.
  • Develop data migration strategies and processes for production deployment.
  • Take care of production and release issues.
  • Support the production support team and Operations team in their day-to-day needs.
  • Problem and incident management: respond to production incidents, tracking incidents to resolution and closure.
  • Coordinate cross-team requirements between upstream/downstream teams, application architects and enterprise teams.
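
A minimal sketch of the data reconciliation checks referenced above: compare counts and totals between staging and target, then list any keys that failed to land; table and column names are hypothetical:

    -- Row counts and amount totals for today's load, staging vs. warehouse.
    SELECT 'STG' AS layer, COUNT(*) AS row_cnt, SUM(order_amount) AS total_amt
    FROM   stg_orders
    WHERE  load_date = TRUNC(SYSDATE)
    UNION ALL
    SELECT 'DWH', COUNT(*), SUM(order_amount)
    FROM   dwh_orders
    WHERE  load_date = TRUNC(SYSDATE);

    -- Keys present in staging but missing from the target.
    SELECT order_id FROM stg_orders WHERE load_date = TRUNC(SYSDATE)
    MINUS
    SELECT order_id FROM dwh_orders WHERE load_date = TRUNC(SYSDATE);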

Environment: Informatica 8.6, Oracle 11g, EBS

Confidential

AVP Technical ETL/WMR-DWH, July 2010 - May 2011

Wealth Management Reporting (WMR) is one of the highly acclaimed projects within Barclays. It has several subsystems such as Investment Catalogue (IC), Client Requisition System (CRC), Data Consolidation (DC), On-Demand Reporting (ODR) and Performance Reporting. The project I worked on is the Data Consolidation (DC) process, which is the heart of the WMR system. DC acts as the core engine providing consolidated information for client statements and performance reporting. DC is a system that performs numerous tasks such as data acquisition, enrichment, reconciliation and exception reporting. It also acts as the medium for correction processing through operations tools. DC acquires settlement system data through daily feeds and uses several reference data provider feeds (Morningstar, MSCI, Lipper, etc.) to get information for market value calculation, accrued interest processing, product classification, etc.

Responsibilities:

Same as the above project; additional details are below.

  • Design Data Flow for the ETL process and integrate the data mapping for optimal performance.
  • Data analysis and Data modeling, discussions with Data Architecture team.
  • Participate in the logical data model design and work closely with the architecture team on physical implementations.
  • Closely work on the dimensional modeling aspects of the DW database design and participate in brainstorm sessions.
  • Responsible for Siebel Analytics data loads using Informatica for the CRC CRM module.
  • Create Autosys batch jobs for custom ETL components and assign run conditions for the proper processing of the data.
  • Involved in Performance Tuning.
  • Worked in full Software Life Cycle/from requirement gathering to testing, Implementation, Deployment and Support.
  • Created reusable metadata by making the Shared Global Repository objects.
  • Responsible for tuning ETL processes, complex mappings and Star Schemas to optimize load and query performance.
  • Developed interfaces using Shell Scripts to automate the bulk load and update processes.
  • Create and enhance PL/SQL programs.
  • Implement PL-SQL components to support the ETL process. Implement performance tuning methods and partitioning, and use Pushdown Optimization in Informatica to process large volumes of data.
  • Design and implement PL-SQL stored procedures, functions, constraints, indexes, tables, materialized views, global temporary tables for various needs.
  • Work on table partition maintenance, merge, partition exchange and index-rebuilding techniques (a partition maintenance sketch follows this list).
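
A minimal sketch of the partition maintenance described above, shown in Oracle-style syntax for illustration (this project's warehouse ran on DB2); table and partition names are hypothetical:

    -- Add next month's partition ahead of the load window.
    ALTER TABLE fact_positions
      ADD PARTITION p_2011_06 VALUES LESS THAN (DATE '2011-07-01');

    -- Merge two aged monthly partitions to keep the partition count manageable.
    ALTER TABLE fact_positions
      MERGE PARTITIONS p_2009_01, p_2009_02 INTO PARTITION p_2009_jan_feb;

    -- Rebuild any local index partitions invalidated by the maintenance.
    ALTER TABLE fact_positions
      MODIFY PARTITION p_2009_jan_feb
      REBUILD UNUSABLE LOCAL INDEXES;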

Environment: Informatica 8.6, DB2, UNIX, DB Artisan, Rapid SQL, Autosys

Confidential

DB-ETL Project Lead

Confidential. This project is to create an Opening Balance data mart within GGL (Global General Ledger) for PnL (Profit and Loss) and BS (Balance Sheet) reporting. FBI (Finance Balance Integrator), an application that runs on Java-based technology, was introduced as the common platform for balance aggregation for various reporting needs. To achieve this, we introduced a new design that divides the weekend archiving process into weekday and weekend processes and loads data into the OB data marts. The FBI application is used to generate balance missions for the opening balance. This allows the new opening balance to be published to the various data marts within the GGL system in a better way.

Responsibilities:

  • Work closely with Business users and Business Analysts to gather requirements.
  • Analysis of Source Data and designing the complete ETL process for the smooth flow of the data.
  • Design and Implement ETL solutions for complex functional and technical requirements using Informatica PowerCenter 8.6.
  • Create custom solutions using Informatica mappings/mapplets/re-usable transformations/shared objects/shortcuts/sessions/workflows/worklets etc.
  • Create complex mappings and promote code from one environment to another.
  • Design Data Flow for the ETL process and integrate the data mapping for optimal performance.
  • Process data from various sources such as flat files, relational tables and XML, and load them to BAW Oracle and SAP BW tables.
  • Create Autosys batch jobs for custom ETL components and assign run conditions for the proper processing of the data.
  • Validation and testing of the ETL components.
  • Unit testing for the developed mappings. Prepare Unit Test Specification Requirements.
  • Worked in full Software Life Cycle/Production Deployment and Support.
  • Created reusable metadata by making the Shared Global Repository objects.
  • Responsible for tuning ETL processes and Star Schemas to optimize load and query performance.
  • Developed interfaces using Shell Scripts to automate the bulk load and update processes.
  • Worked on Database Triggers, Stored Procedures, Functions and Database Constraints.
  • Wrote complex stored procedures and triggers and optimized them for maximum performance (a bulk-load procedure sketch follows this list).
  • Create procedures, functions, triggers and packages for the new requirement.
  • Utilize Oracle utilities such as Import/Export and SQL*Loader for large-scale data loads.
  • Handle the test execution sequence, making sure that deliverables are on track.
  • Reviewing Unit testing results and coordinating UAT Testing
  • Provide debug, analysis, and verification support for UAT/Parallel testing during UAT cycles.
  • Assign code review and participate in peer review of other components.
  • Develop data migration strategies and processes for production deployment.
  • Production Implementation and Support.
  • Problem and incident management: respond to production incidents, tracking incidents to resolution and closure.
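
A minimal sketch of the kind of bulk-load stored procedure referenced above, using BULK COLLECT/FORALL batching; all object names are hypothetical:

    -- Hypothetical batched load from a staging table into an OB data mart table.
    CREATE OR REPLACE PROCEDURE load_ob_balances AS
      CURSOR c_src IS
        SELECT account_id, ledger_date, balance_amount
        FROM   stg_gl_balances;
      TYPE t_src IS TABLE OF c_src%ROWTYPE;
      l_rows t_src;
    BEGIN
      OPEN c_src;
      LOOP
        FETCH c_src BULK COLLECT INTO l_rows LIMIT 10000;  -- fetch in batches
        EXIT WHEN l_rows.COUNT = 0;
        FORALL i IN 1 .. l_rows.COUNT                      -- set-based insert
          INSERT INTO ob_balances (account_id, ledger_date, balance_amount)
          VALUES (l_rows(i).account_id, l_rows(i).ledger_date,
                  l_rows(i).balance_amount);
        COMMIT;
      END LOOP;
      CLOSE c_src;
    END load_ob_balances;
    /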

Environment: Informatica 8.1, Oracle 10g, UNIX, Windows NT, TOAD 9, Autosys, Subversion

Confidential

Global Year-End Carry Forward System: This project is to create a new YE (Year-End) process which carries forward the previous year's ledger balances into the new year. This process is crucial and critical for the investment bank to achieve its year-end closure on time. Various complex logics are processed within the primary carry forward and the incremental carry forward to extract and transform year-old data and load it into YE-specific targets.
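
For illustration, a minimal sketch of a primary carry-forward step: the prior year's closing balances become the new year's opening balances; the table names, period labels and columns are hypothetical, not the actual GGL structures:

    -- Hypothetical carry-forward of ledger balances into the new year.
    INSERT INTO gl_balances_new_year (account_id, period, balance_type, amount)
    SELECT account_id,
           'P00'       AS period,        -- opening period of the new year
           'OPENING'   AS balance_type,
           SUM(amount) AS amount
    FROM   gl_balances_prior_year
    GROUP  BY account_id;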

Responsibilities:

  • Analysis of current Informatica, UNIX, Oracle environments.
  • Interaction with business and BAs to understand new requirements and prepare technical documents.
  • Unit testing and UAT Testing
  • Developing Jobs, Workflows and Data Flows for Data mart.
  • Integration of Data Flows.
  • Unit Testing for the developed mappings.
  • Defining Coding Standards for Data Integrator.
  • Create transformations and mappings using designer tools of Informatica.
  • Involve in Logical data model design and work closely with physical implementations by architecture team.
  • Involved in Performance Tuning.
  • Prepared Unit Test Specification Requirements.
  • Worked in full Software Life Cycle/Production Deployment and Support.
  • Developed mappings and also tuned them for better performance.
  • Developed interfaces using Shell Scripts to automate the bulk load and update processes.
  • Work on Database Triggers, Stored Procedures, Functions and Database Constraints.
  • Write complex stored procedures and triggers and optimize them for maximum performance.

Environment: Informatica 8x, Oracle 10g, Erwin 4.1, PL/SQL, SQL Loader, UNIX, Windows NT, MS Excel

Confidential

Wipro Technologies

Confidential. The project was to create new extracts for TLM processing. These extracts are very special in their format and the data they contain. They are flat file extracts and serve as daily feeds from the ledger to the TLM reconciliation system. The extracts are sent as separate files for each region, and the information in each file is separated into opening balance, details and closing balance. The TLM system reads data from the daily feed files and the ledger, reconciles the balances and reports breaks if they do not match.
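
A minimal sketch of how such a per-region feed file could be produced with PL/SQL and UTL_FILE: one file per region containing an opening-balance record, detail records and a closing-balance record. The directory object, staging table and record layout are hypothetical, not the actual TLM feed specification:

    CREATE OR REPLACE PROCEDURE write_tlm_feed (p_region IN VARCHAR2) AS
      l_file UTL_FILE.FILE_TYPE;
    BEGIN
      l_file := UTL_FILE.FOPEN('TLM_FEED_DIR',
                               'tlm_' || p_region || '.dat', 'w');
      FOR r IN (SELECT record_type, account_id, amount
                FROM   tlm_extract_stage
                WHERE  region = p_region
                ORDER  BY CASE record_type
                            WHEN 'OPENING' THEN 1
                            WHEN 'DETAIL'  THEN 2
                            ELSE 3          -- closing balance written last
                          END)
      LOOP
        UTL_FILE.PUT_LINE(l_file,
          r.record_type || '|' || r.account_id || '|' || TO_CHAR(r.amount));
      END LOOP;
      UTL_FILE.FCLOSE(l_file);
    END write_tlm_feed;
    /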

Responsibilities:

  • Conduct discussions with the users to gather business requirements.
  • Designing of Data Flow.
  • Developing Jobs, Workflows and Data Flows for Data mart.
  • Integration of Data Flows.
  • Validation and Testing of the ETL work products.
  • Unit Testing for the developed mappings. Prepared Unit Test Specification Requirements.
  • Defining Coding Standards for Data Integrator.
  • Create transformations and mappings using designer tools of Informatica.
  • Created reusable metadata by making the Shared Global Repository objects.
  • Developed mappings and also tuned them for better performance.
  • Responsible for tuning ETL procedures and Star Schemas to optimize load and query performance.
  • Developed interfaces using Shell Scripts to automate the bulk load and update processes.
  • Worked on Database Triggers, Stored Procedures, Functions and Database Constraints.
  • Wrote complex stored procedures and triggers and optimized them for maximum performance.

Environment: Oracle 8i/9i, PL/SQL, Informatica, UNIX Shell Scripting, TOAD 7.1

Confidential

AXIOM-Regulatory Datamart: Axiom is a third-party tool used within the bank for regulatory Basel II reporting. The Axiom data mart project was to prepare the staging area for Axiom reporting.

Responsibilities:

  • Conduct discussions with the users to gather business requirements.
  • Designing of Data Flow. Integration of Data Flows.
  • Create transformations and mappings using designer tools of Informatica.
  • Developing Jobs, Workflows and Data Flows for Data mart.
  • Validation and Testing of the ETL work products. Unit Testing for the developed mappings.
  • Defining Coding Standards and follow ETL best practices. Prepared Unit Test Specification Requirements.
  • Worked in full Software Life Cycle/Production Deployment and Support.
  • Involved in Performance Tuning. Tune ETL procedures and Star Schemas to optimize load and query performance.
  • Developed interfaces using Shell Scripts to automate the bulk load and update processes.
  • Worked on Database Triggers, Stored Procedures, Functions and Database Constraints.
  • Wrote complex stored procedures and triggers and optimized them for maximum performance.

Environment: Oracle 8i/9i, PL/SQL, DB2, Informatica, UNIX Shell Scripting, TOAD 7.1

Confidential

ICE-PnL Reporting Suite: Integrated Control Environment (ICE) is an application built specially for profit and loss accounts within the investment bank. The project dealt with the ICE PnL data marts to produce regulatory reports from Business Objects. Integration of the various modules posting data into the application and the reports out of the data marts were the main goals of this project.

Responsibilities:

  • Conduct discussions with the users to gather business requirements.
  • Create transformations and mappings using designer tools of Informatica. Responsible for tuning ETL procedures.
  • Unit testing for the developed mappings.
  • Defining Coding Standards for Data Integrator.
  • Worked in full Software Life Cycle/Production Deployment and Support.
  • Developed mappings and also tuned them for better performance.
  • Developed interfaces using Shell Scripts to automate the bulk load and update processes.
  • Worked on Database Triggers, Stored Procedures, Functions and Database Constraints.
  • Wrote complex stored procedures and triggers and optimized them for maximum performance.
  • Create tables, partitions and indexes, and write complex SQLs to analyze or process data (an analytic SQL sketch follows this list).
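
A minimal sketch of the kind of analytic SQL used here: daily PnL per book with a running month-to-date total; the table and columns are hypothetical, not the actual ICE schema:

    SELECT book_id,
           trade_date,
           SUM(pnl_amount) AS daily_pnl,
           SUM(SUM(pnl_amount)) OVER (
             PARTITION BY book_id, TRUNC(trade_date, 'MM')
             ORDER BY trade_date
           ) AS mtd_pnl
    FROM   pnl_postings
    GROUP  BY book_id, trade_date
    ORDER  BY book_id, trade_date;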

Environment: Oracle 8i/9i, PL/SQL, MS Excel, UNIX Shell Scripting, TOAD, DB Artisan, Subversion

Confidential

Jr. Software Engineer, Banking and Finance Team

Confidential Brothers is a global financial services firm doing business in investment banking, equity and fixed-income sales and trading (especially Confidential Treasury securities), research, investment management, private equity, and private banking. Worked in the IT division of asset management on integrating Thomson systems with Lincoln Capital Management's fixed income business systems, Neuberger Berman and The Crossroads Group following their acquisitions. This project integrated all these external systems with the Lehman AMR system.

Responsibilities:

  • ETL Development using Informatica Power center and Oracle database
  • Data Analysis and Test Data Preparation and Unit testing
  • Production Support and Post Implementation enhancement development
  • Performance Enhancements and Implementation Documentation
