Data Warehouse Engineer/Analyst Resume
San Francisco, CA
SUMMARY
- 10+ years of technology experience in Enterprise Data Warehousing (ETL), Data Analysis, Business Intelligence, and Large-Scale Data Integration Applications.
- Expert-level experience in Data Integration and Management tools such as Informatica Power Center, IDQ 9.x, Test Data Management, Web Services, Metadata Management, Syncsort, Erwin Data Modeler, and ER/Studio.
- Experience using Informatica Power Center Designer to analyze multiple source systems (Oracle, DB2, Teradata, SQL Server, Vertica, COBOL, XML, and flat files) and, based on business requirements, build transformation rules to Extract, Transform, and Load data into various target systems.
- Hands-on experience in scripting languages such as Linux shell, Python, and Perl.
- Extensive involvement in Enterprise Data Architecture, ETL development, and the analysis and design of various Financial Data Marts; implemented Master Data Management and participated in enterprise-level Reference Data Management solutions for financial advisors.
- Involved in a Data Governance program covering enterprise metadata management, data lineage, data quality, business glossaries, and data stewardship for 30+ Systems of Record (SORs).
- Hands-on expertise in building Data Models, Data Dictionaries, and Metadata from scratch, and in refactoring existing data models to improve performance, correctness, and scalability using tools such as Erwin and ER/Studio.
- Created business dashboard reports and collaborated with various application groups and data teams, using data visualization tools such as Tableau and Hyperion Reporting Studio to build business intelligence reports for financial advisors.
- Designed and created complex Hierarchical/Summarized tables to facilitate BI/Dashboard reports for Business users.
- Experience in both functional and technical design, Data Integration (ETL) including Real-Time Data Architecture, Data Quality and data design for Enterprise Data & Client Insights (EDCI) within Wealth Management Advisory groups.
- Extensive experience working in Agile methodology.
- A communicative team player committed to excelling in quality and meeting project deadlines.
TECHNICAL SKILLS
- Business Reporting, Enterprise Data & Client Insights (EDCI), Master Data Management (MDM), Reference Data Management (RDM), Test Data Management (TDM), Metadata Management, Data Lineage, Web Services, Data Governance, Data Quality, Data Profiling, Data Audit, Data Security Models, Quality Assurance (QA), and Job Automation design.
- Informatica Power Center 7.x/9.x, Data Quality 9.x, Informatica TDM 9.x, Syncsort (DMExpress), SSIS, Visual Studio, SQL*Loader, PL/SQL, SQL, Linux Scripting, Python, Perl, FastLoad, MultiLoad, BTEQ, Pentaho Data Integration.
- Visual Basic, C#, PL/SQL, SQL, Python, Unix Shell Scripting, PowerShell Scripting
- Tableau 10x, Hyperion Reporting Studio, Business Objects, OBIEE 11g, Crystal Reports.
- Erwin Data Modeler 7.x, Embarcadero ER/Studio 7.x/9.x
- Oracle 10g/11g, IBM DB2 9.x, Microsoft SQL Server 2012, Teradata 13.x
- Vertica 7.x
- MS DOS, Windows 2003/NT/XP/2012, UNIX (IBM AIX)
- Red Hat Linux Enterprise
- Ralph Kimball & Bill Inmon Methodology, Dimensional Modeling, RUP (use cases) & UML, TOGAF 9
- WSDL, XML/XSD, ServiceNow, Remedy AR System 7.x, JIRA bug tracking, File Transfer (WinSCP), TOAD, Seeburger (File Transfer Management), Control-M, Autosys (enterprise scheduling tools)
PROFESSIONAL EXPERIENCE
Data Warehouse Engineer/Analyst
Confidential, San Francisco, CA
Responsibilities:
- Developed complex Informatica Mappings in Mapping Designer, and Sessions and Workflows in Workflow Manager.
- Performed data profiling and analysis using Informatica Data Quality (IDQ).
- Created complex SQL test scripts to validate data across various Dimension and Fact tables such as Account, Customer, Product, and Transaction.
- Created DQ scorecards in IDQ Developer and used AddressDoctor to standardize addresses across the organization.
- Translated Business Requirements into a technical solution.
- Provided documentation (Data Mapping, Technical Specifications, Production Support, data dictionaries, test cases, etc.) for all projects.
- Managed the end-to-end lifecycle for new data migration, data integration, and data services for Customer Insight dashboards.
- Engaged with implementation partners in application testing, release management, and operations to ensure quality of code development, deployment and post-production support.
- Developed Shell scripts, Python scripts for large Data file processing to facilitate end to end data integration and performance management.
- Led all aspects of software application and product vendor implementations as needed to deploy solutions that integrate with EDCI systems and/or processes.
- Identified bottlenecks in sources, targets, mappings, and sessions, and resolved the resulting performance problems.
- Collaborated with our Data Warehousing peers, leads and other stakeholders to further expand and develop the Enterprise Data Warehouse platform.
- Performed data management activities such as data profiling, data modeling, and metadata management.
- Created Tableau dashboard reports and collaborated with developer teams, using the Tableau platform to build business intelligence reports for financial advisors.
- Worked with the job scheduling tool Control-M for enterprise-wide job scheduling and ETL batch management processes.
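The large-data-file processing scripts mentioned above can be sketched as follows; this is a minimal illustrative example, not the actual production code, and the file layout, delimiter, and function names are all hypothetical:

```python
import csv

def process_large_file(path, chunk_size=50_000):
    """Stream a large pipe-delimited feed in fixed-size chunks so memory
    use stays flat regardless of file size (hypothetical record layout)."""
    totals = {"rows": 0, "bad": 0}
    with open(path, newline="") as fh:
        reader = csv.reader(fh, delimiter="|")
        chunk = []
        for row in reader:
            chunk.append(row)
            if len(chunk) >= chunk_size:
                totals = _load_chunk(chunk, totals)
                chunk = []
        if chunk:  # flush the final partial chunk
            totals = _load_chunk(chunk, totals)
    return totals

def _load_chunk(chunk, totals):
    # Stand-in for the real load step (e.g. a SQL*Loader call or batched
    # INSERTs); here we only count rows and flag records missing the key.
    for row in chunk:
        totals["rows"] += 1
        if not row or not row[0].strip():
            totals["bad"] += 1
    return totals
```

Chunked streaming like this is what keeps a validation or staging script usable on multi-gigabyte feeds.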
Environment: Informatica Power Center 9.x, IDQ 9.x, Metadata Management, Informatica TDM 9.x, Web Services, Tableau 10.x, PL/SQL, SQL Server 2008/2012, Oracle 11g, UNIX shell scripting, Python scripting, SQL*Loader, Control-M, Red Hat Enterprise Linux, Data modeling (Embarcadero ER/Studio), SQL Developer, TOAD.
Senior ETL Developer
Confidential
Responsibilities:
- Performed planning and requirements analysis for a large-scale centralized Wealth Data Warehouse.
- Participated in design and development of the ETL/BI architecture for business intelligence solution components including database design, ETL & Report design and System Integration.
- Created Data Quality Plans for Standardizing/Cleaning/Matching for Wealth Management using Informatica IDQ 9.x
- Extensively worked with Informatica 9.x, DB2 9.x, Oracle 10g/11g, Teradata TD13, PL/SQL Procedures, Unix Shell Scripts, Autosys Scheduler, Syncsort (DMExpress) and created BI platform for Aggregated/Summarized wealth reporting system.
- Participated in and led UAT efforts across all iteration planning and feature implementations.
- Developed complex Informatica Mappings in Mapping Designer, and Sessions and Workflows in Workflow Manager.
- Created complex SQL test scripts to validate data across various Dimension and Fact tables such as Account, Customer, Product, and Transaction.
- Communicated with Investment Management's Business Analysts and Metadata System Analysts to gather requirements.
- Created ETL Design documents and technical specifications for ETL team to understand the Star & Snowflake Schema designs related to Fact & Dimension Tables in MDM.
- Worked with data architecture tools such as Embarcadero ER/Studio to maintain the MDM data models, managed database objects containing PL/SQL (triggers, stored procedures, sequences, and indexes), and monitored overall performance.
- Created test plans and scripts for aggregated/summarized Data Mart tables for Client Link Business Intelligence applications.
- Analyzed highly complex business requirements and created technical specifications to develop ETL (Informatica) and database processes.
- Tested the core functionality of the enterprise-level ETL batch framework and validated user- and system-driven in-house scheduling applications, minimizing the IT and business resources required for ETL deployment and optimizing operations and support activities.
- Performed and managed database backup and recovery.
- Created database objects including tables, triggers, views, stored procedures, indexes, defaults and rules.
- Performed the maintenance of the clustered and non-clustered indexes.
- Created Hyperion reports and collaborated with business teams on dashboard requirements; built and populated summarized/aggregated tables for BI reporting using Hyperion Reporting Studio.
- Created & monitored history jobs for maximum availability of data & to ensure consistency of databases.
- Coordinated with reporting and business service teams to resolve and fix issues.
- Led the project independently and communicated with Business Analysts and the QA team as required.
Environment: Informatica Power Center 9.x, Syncsort (DMExpress), DB2 9.x, Teradata TD 13.x, SQL Server 2008/2012, Oracle, UNIX shell scripting, PL/SQL, SQL*Loader, OBIEE 11g, Autosys, Sun Solaris UNIX, Data modeling (Embarcadero ER/Studio), SQL*Plus, TOAD.
Sr. System Engineer
Confidential
Responsibilities:
- Extensively worked with Informatica 8.x, Power Exchange 8.x, Oracle 10g/11g, PL/SQL Procedures, UNIX Shell Scripts, Business Objects Report development.
- Facilitated the gathering, consolidation, and prioritization of client business requirements for various ETL developments across the enterprise.
- Participated in designing and developing of the ETL/BI architecture for all business intelligence solution components including database design, ETL, report design and system integration.
- Developed complex Mappings in Informatica Mapping Designer, and Sessions and Workflows in Workflow Manager.
- Designed ETL specifications, PL/SQL procedures, and UNIX scripts to populate the reporting data mart for the Remaining Principal Balance (RPB) application, covering different customers, investors, and vendors.
- Documented user requirements, translated requirements into system solutions, and developed implementation plan and schedule.
- Extracted data from different sources like Oracle, SQL Server, DB2, Flat Files, and XML.
- Created multiple UNIX shell scripts to conditionally validate, schedule and automate the ETL tasks/sessions and performed required actions on Oracle databases.
- Performed issue tracking and defect collection.
- Used Erwin to keep track of database model changes associated with ETL design.
- Performed version control using Visual SourceSafe.
- Developed UNIX shell scripts for process automation and was involved in SIT and production support.
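The conditional validation that such automation scripts perform before triggering an ETL session can be sketched in a few lines; this is an illustrative example only, and the expected field count, delimiter, and function name are hypothetical:

```python
import os

def validate_feed(path, expected_fields=5, delimiter="|"):
    """Pre-load checks: file exists, is non-empty, and every record has
    the expected field count. Returns (ok, reason) so a wrapper script
    can decide whether to start the ETL session or raise an alert."""
    if not os.path.isfile(path):
        return False, "missing file"
    if os.path.getsize(path) == 0:
        return False, "empty file"
    with open(path) as fh:
        for lineno, line in enumerate(fh, start=1):
            if line.rstrip("\n").count(delimiter) != expected_fields - 1:
                return False, f"bad field count on line {lineno}"
    return True, "ok"
```

A scheduler wrapper would call this first and only kick off the Informatica session (or page support) based on the returned status.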
Environment: Informatica Power Center 8.6 (Source Analyzer, Warehouse Designer, Mapplet Designer, Mapping Designer, and Transformation Developer), Remedy (AR System 6.x), Power Exchange 8.x, Oracle 10g/11g, Vertica 3.0/4.0, UNIX, Cognos, DB2, PL/SQL Developer, SQL, PL/SQL, SQL*Loader, TOAD, Erwin, Pentaho Data Integration.