Data Architect Resume
Minneapolis, MN
EXPERIENCE SUMMARY:
- Seasoned DW/BI and ETL architect, designer, and developer with over 11 years of experience in the IT industry, with extensive exposure to the design and development of data warehouses on relational (Oracle and Teradata) and cloud (Amazon Redshift) databases.
- Involved in all phases of data warehousing projects, including requirements gathering, analysis, design, development, and testing, using Oracle and Teradata technology and the Informatica ETL tool, with a good understanding of the retail and financial services industries.
- Strong analytical skills combined with solid experience in the ETL process across the data warehousing lifecycle, along with good interpersonal, written, and oral communication skills.
- Experience in leading and managing teams; handled multiple roles including Data Architect, Team Lead, Onsite Coordinator, and BI Developer.
- Expert in Database and ETL Architecture using Oracle/RedShift/Teradata/Informatica/Unix.
- Worked in Retail Markets, Mortgage Banking and Credit Card domain.
- Expert in writing complex SQL queries and PL/SQL stored procedures/packages.
- Proficient in Unix shell scripting, created generic and dynamic scripts.
- Key member in setting Informatica standards and best practices for the project.
- Good knowledge of Real Time Data Integration with Informatica MQ.
- Designed and Conducted Informatica and Oracle training sessions and mentored peers.
- Proficient in analyzing and translating business requirements to technical design and architecture.
TECHNICAL SKILLS:
ETL Tools: Informatica 6.x/7.x/8.x/9.x, Amazon Datanet
Operating Systems: Windows, Sun Solaris 7.0/8.0, Linux 5.x, Amazon Linux
Databases and Tools: Oracle 9i/10g/11g, Teradata, Netezza, RedShift, MySQL, Vertica, Callidus TrueComp, SAP, Amazon Web Services (AWS)
Programming Languages: SQL, PL/SQL, Visual Basic 6.0, C, VB.NET, Unix shell scripting
Other Utilities: MS Visio, TOAD, SQL Navigator, Aqua, SQL Developer, Heat, Tidal Scheduler, Marval, SCM Surround, Microsoft VSS, Quality Center, Test Track Pro, Jira, Peregrine, Subversion, Control M, Amazon Brazil/Apollo, Amazon S3, EC2, ScrumWorks, Nexus Query Chameleon
PROFESSIONAL EXPERIENCE:
Confidential, Minneapolis, MN
Technology: Informatica 9.6.1, Oracle 11g, SAP, Unix
Data Architect
Responsibilities:
- Architect and Lead the Conversion Program for Pricing data
- Analyze the Pricing data for Covidien and integrate it into Confidential.
- Serve as Architect and BI Engineer working on Architecture, Design, ETL, Development and Operations.
- Extract Pricing data from the legacy systems E1 (JDE) and ET’s legacy SAP system, and convert and transform the data per the Pricing Group Types and Contract Types.
- Allocate the Pricing Condition Types and Condition Tables, Convert customers and materials as per the Confidential SAP system.
- Design requirements for the LSMW Programs/IDocs to load the cleansed and transformed data into Confidential
- Design the data load strategy into SAP (Over 60 million pricing records)
- Provide ETL design and testing for the new requirements.
- Design and Develop Oracle PL/SQL scripts, Informatica Mappings/Workflows and UNIX scripts for the data load processes.
- Provide reports on the different types of pricing data
- Coordinate work with cross-functional areas of the project (Legacy system Business and IT team, Confidential IT functional team, SAP development team etc.)
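The conversion logic above (mapping legacy pricing records onto SAP condition types) can be sketched as a simple lookup-driven transform. The mapping values and field names below are hypothetical placeholders, not the actual Covidien/Confidential configuration:

```python
# Hedged sketch: convert legacy pricing rows into SAP-style condition records.
# The mapping table and field names are illustrative placeholders only.

# Hypothetical mapping: legacy pricing group type -> SAP condition type
CONDITION_TYPE_MAP = {
    "LIST": "PR00",   # base list price
    "DISC": "K007",   # customer discount
}

def convert_pricing_record(legacy):
    """Translate one legacy pricing row into an SAP-style condition record."""
    cond_type = CONDITION_TYPE_MAP.get(legacy["group_type"])
    if cond_type is None:
        raise ValueError("unmapped pricing group type: " + legacy["group_type"])
    return {
        "condition_type": cond_type,
        "customer": legacy["customer"].zfill(10),  # SAP customer numbers are zero-padded
        "material": legacy["material"].upper(),
        "rate": legacy["rate"],
    }
```

In a real conversion this transform would run per record inside the ETL flow, with unmapped group types routed to an error table rather than raised.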
Confidential, Bothell, WA
Technology: Informatica 9.6.1, Oracle 11g, Netezza, Vertica, Unix
Business Intelligence Engineer
Responsibilities:
- Analyze, design, and build reporting solutions around a variety of web-related performance and visitor-behavior metrics in the eCommerce Reporting Datamart.
- Serve as BI Engineer working back-end (Architecture, ETL, Development and Operations).
- Design and migrate the current Clickstream data feed from Webtrends and OLAM to Adobe, adding more wireline data and cross-channel data (call center).
- Work on migrating the Datamart from Netezza to Vertica.
- Assist the teams that report on the customer-facing att.com/wireless websites and a large in-house data mart (up to billions of rows per table) that tracks various clickstream, order, and subscriber metrics for the eCommerce channel, in addition to other tools.
- Design solutions to solve complex integrated reporting requirements for various reporting releases.
- Review requirements with project members, provide ETL design and testing for implementing new requirements, documentation and transfer knowledge to operations specialists to sustain the new job(s).
- Design and Develop Oracle PL/SQL scripts, Netezza SQL scripts, Informatica Mappings/Workflows and UNIX scripts for the data load processes.
- Provide solutions to ad hoc queries related to a given project.
- Interface with the source systems’ technical teams, business teams, reporting analysts, and others.
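A datamart of this kind typically rolls raw clickstream events up into daily metrics before reporting. A minimal, hypothetical sketch of such a rollup (the event fields are assumptions for illustration, not the actual schema):

```python
from collections import defaultdict

# Hedged sketch: aggregate raw clickstream events into daily page-view
# counts per channel, the kind of metric such a datamart reports on.
# Event field names are assumptions for illustration only.

def daily_pageviews(events):
    """Aggregate (date, channel) -> page-view count from raw click events."""
    counts = defaultdict(int)
    for e in events:
        if e.get("event_type") == "pageview":
            counts[(e["date"], e["channel"])] += 1
    return dict(counts)
```

At billions of rows per table this aggregation would of course run in the database (or in the ETL tool) rather than in application code; the sketch only shows the shape of the rollup.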
Confidential, Seattle, WA
Technology: Amazon Datanet, Oracle 11g, RedShift, MySQL, Unix, Amazon Linux, Amazon AWS - S3, EC2
Business Intelligence Engineer
Responsibilities:
- Design and develop Architectural diagrams and the data flow processes.
- Review existing design of the data management systems, provide feedback on architecture standards and best practices.
- Maintain databases including Oracle, MySQL, and cloud DBs.
- Support the BI platform, help the business come up with BI metrics, and generate reports.
- Analyze data model, write business algorithms using PL/SQL stored procedures/packages and dynamic queries, and perform proof of concepts for new data architecture.
- Develop Data load processes using PL/SQL Packages and Informatica.
- Work on performance tuning to increase the throughput of the data loads, work on SQL tuning process by analyzing the queries with explain plan, adding hints, creating new indexes and partitions.
- Create a one-stop-shop data model for all Linehaul Tenders, Inbound Receipts, and Invoices.
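The SQL-tuning loop described above (explain plan, hints, indexes, partitions) can be sketched as small helpers that wrap a query for Oracle's EXPLAIN PLAN and inject an optimizer hint. The statement id and hint below are illustrative, not from the actual project:

```python
# Hedged sketch of the tuning workflow: wrap a query for Oracle EXPLAIN PLAN
# and inject an optimizer hint. Statement id and hint text are illustrative.

def explain_plan_sql(query, statement_id="tune_01"):
    """Build an Oracle EXPLAIN PLAN statement for the given query."""
    return "EXPLAIN PLAN SET STATEMENT_ID = '{}' FOR {}".format(statement_id, query)

def add_hint(query, hint):
    """Inject an optimizer hint comment right after the leading SELECT keyword."""
    head, sep, tail = query.partition("SELECT")
    if not sep:
        raise ValueError("query must contain SELECT")
    return "{}SELECT /*+ {} */{}".format(head, hint, tail)
```

The plan rows would then be read back from PLAN_TABLE (e.g. via DBMS_XPLAN.DISPLAY) to compare the hinted and unhinted versions; that round trip is omitted here.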
Confidential, Madison, WI
Technology: Informatica PowerCenter 9.1.0/9.6.0, Oracle 11g, Unix, Amazon Linux, Amazon AWS - S3, EC2
Datawarehouse Architect
Responsibilities:
- Review the existing Confidential data warehouse and ETL architecture and provide feedback on design improvements and best practices.
- Create document templates and standards for Design, Technical Spec, Implementation, and Test Plan deliverables, and enforce those standards across all team members.
- Strategize infrastructure and data migration into Amazon AWS while maintaining all standards.
- Design the Architectural diagram for Amazon Infrastructure for Confidential DW and BI and the data flow processes.
- Design and Develop Data Retention processes using PL/SQL packages with dynamic SQL queries.
- Configure and Setup Amazon EC2 Servers.
- Designed and developed a code base to export the entire database from the third-party vendor, upload it to Amazon Cloud (S3), download it to the Amazon server, and import it into the Amazon DB.
- Analyze the requirements, Architect, Design, Prepare/Review Technical Specification Document, Development, Code review and Unit Testing.
- Develop and Enhance the existing code base to be more generic and scalable.
- Design the data load process using Informatica, Oracle and Unix.
- Provide solutions and work with all teams on data load management for the entire Confidential
- Build and enhance the operational model for all online affiliate networks and all programs for those networks.
- Responsible for upgrading Informatica from 9.1.0 to 9.6.0
- Analyzed the Clickstream data load and usage, and introduced a better model, including restructuring tables with partitioning and indexes.
- Worked on the data retention policy and the archival/purge of all Clickstream tables, which reduced the total DB size from 16 TB to 7 TB.
- Analyze slow-performing jobs and provide solutions for improvement in both the DB and ETL layers.
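The retention and purge work above is typically driven by dynamic SQL over partitioned tables. A rough sketch of generating the purge DDL, where the P_YYYYMM monthly partition naming is an assumption, not the actual Confidential schema:

```python
from datetime import date

# Hedged sketch: generate ALTER TABLE ... DROP PARTITION statements for
# monthly partitions older than the retention window. The P_YYYYMM naming
# convention is an assumption for illustration.

def purge_statements(table, partitions, keep_months, today):
    """Return DROP PARTITION DDL for monthly partitions outside retention."""
    # First retained month, expressed as a YYYYMM integer.
    total = today.year * 12 + (today.month - 1) - keep_months
    cutoff = (total // 12) * 100 + (total % 12) + 1
    stmts = []
    for p in sorted(partitions):
        yyyymm = int(p.split("_")[1])  # e.g. "P_201406" -> 201406
        if yyyymm < cutoff:
            stmts.append("ALTER TABLE {} DROP PARTITION {}".format(table, p))
    return stmts
```

In practice a PL/SQL package would read the partition list from USER_TAB_PARTITIONS and execute each statement with EXECUTE IMMEDIATE, archiving data first where the policy requires it.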
Confidential, Columbus, OH
Technology: Informatica PowerCenter 9.1, Teradata, Unix, Control M
Architect and Team Lead
Responsibilities:
- Lead the central common components team for Confidential
- Provide consulting and present the reference Architecture for all new LOBs coming into Confidential
- Review the data, ETL, and BI architecture designs of all teams, provide feedback on design improvements and best practices, and give approvals.
- Proposed Real Time reference Architecture for Confidential using Informatica MQ
- Provided solution and worked with all teams for the data load management for the entire Confidential platform
- Analyze, Architect, Design, Prepare/Review Technical Spec Document, Development, Code review and Unit Testing of common components.
- Develop and Enhance the Audit Model and Common Components for the entire Confidential
- Design, build and maintain mappings, sessions and workflows for the Confidential data load process using Informatica, Teradata and Unix
- Design and Schedule all the data load processes using Control M.
- Enhanced the performance of the overall ETL load with different Teradata TPT utilities - FastLoad, MultiLoad, and TPump.
- Enhanced all generic UNIX scripts to apply Query Banding in Teradata for all SQLs, which helped in the analysis and improvement of complex, time-consuming, and resource-intensive queries.
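Query banding as described above is applied by emitting a SET QUERY_BAND statement before each SQL so the workload can be attributed in Teradata's query logs. A minimal sketch of building that statement from key-value pairs (the band keys here are illustrative, not the project's actual standard):

```python
# Hedged sketch: build a Teradata SET QUERY_BAND statement from key/value
# pairs so each script's SQL can be attributed in DBQL. The band keys used
# below are illustrative only.

def query_band_sql(bands, scope="SESSION"):
    """Return a SET QUERY_BAND ... FOR SESSION/TRANSACTION statement."""
    if scope not in ("SESSION", "TRANSACTION"):
        raise ValueError("scope must be SESSION or TRANSACTION")
    pairs = "".join("{}={};".format(k, v) for k, v in bands.items())
    return "SET QUERY_BAND = '{}' FOR {};".format(pairs, scope)
```

A generic wrapper script would prepend this statement to every BTEQ/SQL invocation, so each query carries its job and step identity into the DBQL analysis.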
Confidential, NJ
Technology: Informatica PowerCenter 8.6.1/9.0/9.1, Oracle 10g/11g PL/SQL, Unix, TrueComp
Team Lead
Responsibilities:
- Key player of Data Integration Team.
- Responsible for analyzing the business requirements and providing a feasibility report and estimates for each requirement.
- Responsible for Design, Technical Specification Document preparation, Development, Code review and Unit Testing of the process.
- Design, build and maintain mappings, sessions and workflows for the TrueComp data load process using Informatica, PL/SQL and Unix.
- Prepare test data, support SIT and UAT and resolve the defects.
- Responsible for production support of the data load processes: issue analysis, root cause evaluation, issue resolution, and closure.
Confidential, Atlanta
Technology: Informatica PowerCenter 7.2.1, Oracle 10g PL/SQL, Netezza, Unix
Onsite Team Lead
Responsibilities:
- Single point of contact for the client in Confidential related activities.
- Played the roles of Technical Lead, Designer, Developer, Onsite Coordinator, and Analyst.
- Coordinate the entire SDLC for each project.
- Integrate/Coordinate with different teams to gather and understand the precise requirements regarding the system, data load process and source data validation for each project.
- Preparation of the Requirements document/Work Request document.
- Preparation of time estimates and tracking the effort spent and issues encountered for the completion of each project.
- Responsible for Design, Technical Specification Document preparation, Development, Code review and Unit Testing of the process.
- Support production implementation and validate the production data to ensure that the data load has no discrepancies.
- Responsible for Production Support, Issue Analysis, Root Cause Evaluation, Issue Resolution, and Closure.