
Enterprise Warehouse Data Analyst Resume


Cleveland, OH

SUMMARY

  • Over 12 years of experience as a Data Warehousing Specialist with ETL and OBIEE development.
  • Deep understanding and analysis of STMs (Source to Target Mappings) provided by data experts and data analysts.
  • Strong knowledge of Oracle, Teradata, and Hive data modeling and of building data marts per users' requirements.
  • Good experience integrating various data sources with multiple relational databases such as Oracle and SQL Server, and integrating data from non-relational sources such as flat files and Excel sheets.
  • Strong technical knowledge of Informatica ETL Ecosystems to provide big data and Data warehousing Analytics.
  • Strong data warehousing ETL experience using Informatica Power Center, Power Exchange, and MDM Hub (versions 10.1/9.6.1/9.5.1/9.1/8.6.1/8.5/8.1/7.1), including the Power Center client tools Mapping Designer, Repository Manager, and Workflow Manager/Monitor.
  • Extensive experience working with Tableau Desktop and Tableau Server across Tableau 9.x (9.3, 9.0) and 8.x (8.3, 8.2, 8.1, 8.0), as well as Business Objects.
  • 2 years of experience with OFSAAI Data Entry Forms and Queries, Excel upload with the Maker/Checker process, the Metadata Restore/Archive utility, and data model creation and upload using the Erwin tool.
  • Performed project activities such as module tracking, understanding the nature of the data with the help of a business analyst, design discussions, task allocation, code review, mentoring developers, code demos with business users, code migration, performing UAT and obtaining sign-off, release management, job scheduling, and production turnover meetings.
  • Good knowledge of data warehouse concepts and principles (Kimball/Inmon) - star schema, snowflake schema, SCDs, surrogate keys, and normalization/denormalization.
  • Responsible for identifying entities and their attributes to create entity-relationship and architecture diagrams.
  • Experienced in creating Logical Data Objects (LDOs) and profiling multiple sources in Informatica Developer and Analyst.
  • Passionate, hardworking developer and talented individual fluent in ETL Technologies and Data Analysis
  • Strong working experience in ETL (Informatica), data warehousing concepts (SCDs, business intelligence, data modeling, etc.), SQL, shell scripting, and relational databases such as Teradata and Oracle.
  • Experience in debugging mappings: identified bugs in existing mappings by analyzing the data flow and evaluating transformations, with emphasis on mapping performance optimization.
  • Excellent reputation for resolving problems and improving customer satisfaction.
  • Well-versed in data warehousing concepts such as Transforming Data, Data Organizing, Data Profiling, Data cleansing, and Data Lineage.
  • Organized approach to meeting multiple concurrent deadlines and working with support teams to productionize code.
  • Draws on active knowledge of the current technology landscape to promote best practices in data warehousing.
  • Extensive experience in the Extraction, Transformation, and Loading of data using Informatica from heterogeneous sources.
  • Strong ability to analyze source systems and business requirements, identify and document business rules, design data architecture for cohesive decision support, and prepare dimensional data models.
  • Extensive knowledge of Dimensional Data Modeling like Star and Snowflake schemas and knowledge in designing tools like Erwin and Power Designer.
  • Extensively worked in the development of Informatica Mappings and Informatica Workflows.
  • Experience with Informatica advanced techniques - dynamic caching, memory management, and parallel processing to increase performance throughput.
  • Experience in creating and using Stored Procedures, Functions, Triggers, Views, Synonyms, and Packages in SQL Server 2000/2005, Oracle 10g/9/8i, and DB2.
  • Involved in performance/query tuning: generation and interpretation of explain plans and tuning SQL to improve performance.
  • Skilled in Unix Shell Scripting and experience on different UNIX platforms.
  • Experience in scheduling ETL jobs using crontab and Control-M.
  • Expertise in full life cycle Business Intelligence implementations and an understanding of all aspects of an implementation project using OBIEE 10.1.3.x/Siebel Analytics 7.x.
  • Experienced in gathering reporting and analysis requirements, documenting the report specifications, and implementing the metadata layers including Physical, Business Model and Mapping, and Presentation layer.
  • Experienced in migration of OBIEE reports between dev/test/prod environments.
  • Proficient in defining Key Performance Metrics (KPIs), facts, dimensions, hierarchies, and developing Star and Snowflake schemas.
  • Expertise in the design and development of the three layers (Physical, Business Model and Mapping, Presentation) of an OBIEE Metadata Repository (.rpd) using the Oracle BI Administration Tool.
  • Highly skilled at configuring OBIEE Metadata Objects including repository, variables, interactive dashboards, and reports.
  • Experienced in designing customized interactive dashboards in OBIEE using drill down, guided navigation, prompts, filters, and variables.
  • Expert in using OBIEE Answers to create queries, format views, and charts, and add user interactivity and dynamic content to enhance the user experience.
  • Experienced in working with Variable Manager to define session and repository variables and initialization blocks to streamline administrative tasks and modify metadata content dynamically.
  • Proficient in Installation, configuration, and administration of the OBIEE platform.
  • Experienced in developing Dimensional Hierarchies, Level Based Measures, and adding multiple sources to business model objects.
  • Experienced in configuring and setting up OBIEE Security using LDAP and External Database Tables and configuring object level and database level security.
  • Extensively worked with DBAs and BAs on performance tuning in OBIEE and data warehouse environments using cache management, aggregate tables, and indexes.
  • Extensive experience in creating executive reports with BI Publisher integrated with OBIEE.
  • Experienced in creating interactive UI designs (Siebel Answers, Delivers, and Dashboards), including user interface design using CSS, HTML, XML, and JavaScript.
  • Highly skilled in Data Analysis, Data Modeling, Dimensional Modeling, and Data Warehousing.
  • Experienced in creating Sessions, Workflows, Mappings, and Mapplets using various available transformations in Informatica, including Router, Aggregator, and Lookup.
  • Experience in creating and understanding Use Case Specifications and UML diagrams including Use Case Diagrams, Activity Diagrams, Sequence Diagrams, Class Diagrams, Data Flow Diagrams, and Flow Chart diagrams using Rational Rose and MS Visio.
  • Competencies include business analysis using the RUP methodology, outlining system design and dependencies, writing functional and technical specifications, business and system requirements documentation, issue resolution, and defect management.
  • Knowledge in developing reports using Business Intelligence tools like Business Objects and Cognos.
  • Maintained outstanding relationships with business analysts and business users to identify information needs per business requirements.
  • Experience in working in an onsite-offshore structure and effectively coordinating tasks between onsite and offshore teams.
  • Experience in full-cycle software development including requirements gathering, prototyping, proof of concept, design, documentation, implementation, testing, maintenance, and production support.
  • A highly motivated self-starter and a good team player with excellent verbal and written communication skills.
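The star-schema and surrogate-key concepts summarized above can be sketched in a few lines of Python. This is an illustrative toy only - the customer keys, starting key value, and function name are invented, not taken from any actual project:

```python
# Minimal sketch of surrogate-key assignment during a star-schema dimension
# load. All table, key, and column names here are hypothetical.

def load_dimension(existing, incoming, next_key):
    """Assign a surrogate key to each previously unseen natural key.

    existing: dict mapping natural key -> surrogate key (current dimension)
    incoming: iterable of natural keys arriving from the source system
    next_key: first unused surrogate key value
    Returns the updated dimension as a new dict.
    """
    dim = dict(existing)
    for natural_key in incoming:
        if natural_key not in dim:      # only brand-new members get a key
            dim[natural_key] = next_key
            next_key += 1
    return dim

dim = load_dimension({"CUST-001": 1, "CUST-002": 2},
                     ["CUST-002", "CUST-003"], next_key=3)
# CUST-003 is new, so it receives surrogate key 3; CUST-002 keeps key 2
```

In a real Informatica mapping this role is played by a Sequence Generator transformation feeding the dimension insert path; the sketch only shows the idea.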

TECHNICAL SKILLS

  • Informatica
  • Talend
  • Unix (Shell Scripting)
  • OBIEE 10.1.3.x
  • Siebel Analytics 7.x
  • Databases: Oracle 9i/10g
  • MS SQL Server 200x
  • MS Access
  • TOAD
  • Oracle SQL Developer
  • SQL
  • Teradata
  • Hive
  • Oracle
  • DevOps (Bitbucket, uDeploy, Artifactory, etc.)
  • Office 365
  • Microsoft Visio
  • Data Governance
  • Data Protection
  • Data Profiling
  • Metadata Management and Data Lineage.
  • Power BI
  • Python (Basics)
  • Agile and Waterfall methodologies
  • Scheduling tools: CA7, AutoSys

PROFESSIONAL EXPERIENCE

Confidential, Cleveland OH

Enterprise Warehouse Data Analyst

Responsibilities:

  • Requirement gathering and analysis of STMs provided by data analysts and data SMEs, translating plain-English requirements into technical language.
  • Manually writing SQL and procedures for the provided STMs, to be used for ETLs in Informatica Power Center.
  • Developing ETLs to transform the data, load the target tables, and build data marts per the STMs provided.
  • Performing data profiling and cleansing as needed per the STMs and requirements, using Informatica tools such as Informatica Data Analyst and its data-profiling features.
  • Integrated various data sources with multiple relational databases such as Oracle, SQL Server, and Teradata, and integrated data from non-relational sources such as flat files, mainframe files, and Excel sheets.
  • Code review of all components (Informatica ETLs, CA7 scheduling, unit test cases, IQA and EQA documents, etc.) developed by peers, using the code review checklist.
  • Designing data warehousing ETLs to automate STMs by preparing high- and low-level design documents
  • Deployment of ETL code to various environments (Test, QA, and production) using DevOps tools - uDeploy, Bitbucket, etc.
  • Coordination with the TCoE team and business users for functional and user acceptance testing.
  • Production readiness activities such as documentation, Object inventory preparation, and Turnover meetings wif production support teams before deployment in production.
  • Performance optimization of the existing code based on requirements.
  • Performed defect analysis and resolution, bug fixing, and defect prevention.
  • Resolving issues relating to technical and functional components and escalating matters requiring immediate attention to the immediate supervisors.
  • Daily status reporting via calls and emails to the project lead, and weekly calls and status reports to managers and clients.
  • Responsible for Requirement Gathering Analysis and End-user Meetings
  • Responsible for Business Requirement Documents (BRDs) and converting functional requirements into technical specifications.
  • Responsible for delivering, on a single platform, integrated business areas previously distributed across SAS, SQL Server, and desktop applications.
  • Created the entity-relationship model for Product 2.0, covering child, parent, landing, and base tables for Informatica MDM.
  • Worked with business analysts to understand the nature of the data and, accordingly, determine the best possible way to implement ETL for different domains.
  • Responsible for creating Logical Data Objects (LDOs) and profiling multiple sources in Informatica Developer and Analyst.
  • Development and enrichment of Compliance documentation for existing and new application components.
  • Coordination wif the Offshore team for development-related activities.
  • Proof of concept (POC) on Ad-hoc requirements.
  • Developed technical specifications of the ETL process flow
  • Designed the source-to-target mappings and was involved in designing the selection criteria document.
  • Worked on the design and development of Informatica mappings, and workflows to load data into the staging area, data warehouse, and data marts in SQL Server and Oracle.
  • Used Informatica Power Center to create mappings, sessions, and workflows for populating the data into the dimension, fact, and lookup tables simultaneously from different source systems (SQL Server, Oracle, flat files).
  • Created mappings using various Transformations like Source Qualifier, Aggregator, Expression, Filter, Router, Joiner, Stored Procedure, Lookup, Update Strategy, Sequence Generator, and Normalizer.
  • Deployed reusable transformation objects such as mapplets to avoid duplication of metadata, reducing development time.
  • Used version mapping to update the slowly changing dimensions, keeping full history in the target database.
  • Created and Monitored Workflows using Workflow Manager and Workflow Monitor.
  • Used a Debugger to test the mappings and fixed the bugs.
  • Tuned performance of mappings and sessions by optimizing source and target bottlenecks, and implemented pipeline partitioning.
  • Worked with the DBA on partitioning and creating indexes on tables used in source qualifier queries.
  • Involved in Performance/Query tuning. Generation/interpretation of explain plans and tuning SQL to improve performance.
  • Involved in exporting databases, tablespaces, and tables using Data Pump (10g) as well as traditional export/import (until 9i).
  • Scheduled various daily and monthly ETL loads using Control-M.
  • Involved in writing UNIX shell scripts to run and schedule batch jobs.
  • Involved in unit testing and documentation of the ETL process
  • Involved in Production Support in resolving issues and bugs.
  • Providing estimates on Transforming the data and building different data marts as per the Data Analyst’s needs.
  • Acted as a liaison between business users and the BI team to facilitate understanding of business requirements.
  • Created a Requirements Traceability Matrix and mapping document to trace requirements through the BI implementation.
  • Conducted functional requirements reviews and walk-throughs with the BI architect, designers, and stakeholders.
  • Analyzed the data and developed the logical star schema model in consultation with data modelers and business analysts.
  • Developed metadata repository using OBIEE Administration tool in Physical, Business Model and Mapping, and Presentation Layer.
  • Created new logical columns, dimensional hierarchy, calculated measures, and aggregate mappings in the BMM layer as per the business requirements.
  • Developed Time Series Objects using Ago and To Date functions for time series comparisons.
  • Designed and developed various Interactive Dashboards and reports wif drill-downs, guided navigation, filters, and prompts.
  • Created various session and repository variables and initialized them in the Initialization Blocks to change metadata dynamically as the environment changes.
  • Worked on the performance tuning of dashboards and reports with business analysts and DBAs.
  • Implemented security based on LDAP Authentication.
  • Used various cache management techniques for performance tuning including configuring, defining cache parameters, cache persistence time, cache seeding and purging, and event polling.
  • Created security settings in OBIEE Administration Tool and set up groups, access privileges, and query privileges and also managed security for groups in Answers.
  • Integrated BI Publisher with OBIEE to build reports in Word, Excel, and document formats.
  • Created various templates, reports, and prompts in BI Publisher.
  • Developed test cases and performed Unit Testing and Integration Testing of the Repository and Web Catalog.
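The data-profiling work described above (null counts and distinct values per column, as done with Informatica Data Analyst) amounts to logic like the following Python sketch. The sample rows and column names are invented for illustration:

```python
# Sketch of basic column profiling: for each column, count rows, nulls,
# and distinct non-null values. Data and column names are hypothetical.

def profile(rows, columns):
    stats = {c: {"rows": 0, "nulls": 0, "distinct": set()} for c in columns}
    for row in rows:
        for c in columns:
            value = row.get(c)
            stats[c]["rows"] += 1
            if value is None:
                stats[c]["nulls"] += 1
            else:
                stats[c]["distinct"].add(value)
    # report distinct values as a count rather than the raw set
    return {c: {"rows": s["rows"], "nulls": s["nulls"],
                "distinct": len(s["distinct"])}
            for c, s in stats.items()}

sample = [{"state": "OH", "zip": "44101"},
          {"state": "OH", "zip": None},
          {"state": "PA", "zip": "15201"}]
report = profile(sample, ["state", "zip"])
# report["state"] -> {"rows": 3, "nulls": 0, "distinct": 2}
```

A profiling tool adds pattern analysis, min/max, and sampling on top of this, but the null/distinct counts above are the core of a column profile.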

Environment: Oracle 11gR2 Client, Teradata 15.0, Informatica 10.1, Talend Data Integration 5.6.1, Tableau Desktop/Server 9.0/9.3, OBIEE 10.1.3.4, BI Publisher 10.1.3.4, SQL, Toad 9.7, Windows XP, HTML, CSS, EAS & HPP 11.1.2.3, CA Software Change Manager, HP ALM, UNIX, SQL Server Management Studio, Vena System, WinSCP, Endeavor- CA7.

Confidential, Pittsburgh, PA

ETL Developer

Responsibilities:

  • Development of ETL Mapping/session/workflows using Informatica, and Shell scripting for automated data quality on historical data for different source systems.
  • Performed data profiling to identify the pattern of data using the Informatica Analyst tool and Teradata Stored Proc.
  • Responsible for requirement gathering by connecting with business users and business analysts.
  • Responsible for Business Requirement Documents (BRDs) and converting functional requirements into technical specifications.
  • Responsible for delivering, on a single platform, integrated business areas previously distributed across SAS, SQL Server, and desktop applications.
  • Worked very closely with the ETL architect to identify the different source systems and the nature of their data; created architecture diagrams for different source systems such as the PPNR, Mortgage, and Expense models.
  • Responsible for identifying the Entities, Attributes & their Relationships to create an Entity relationship diagram.
  • Responsible for creating logical and physical data models in Erwin; created the star dimensional model for CCAR WS#4, including dimension, mapping, reference, stage, and fact tables.
  • Worked with business analysts to understand the nature of the data and, accordingly, determine the best possible way to implement ETL for different domains.
  • Created reusable mapplets to apply the same process across all PPNR and Mortgage models for different source systems.
  • Extracted data from various heterogeneous sources like Oracle, SQL Server, Teradata, Flat Files & XML.
  • Responsible for implementing Re-usable Mapplets, transformations, Mappings, Emails, Command tasks, and Unix Scripts.
  • Responsible for creating Logical Data Objects (LDOs) and profiling multiple sources in Informatica Developer and Analyst.
  • Responsible for using Data Integration Hub (DIH) in creating topics and applications to publish and subscribe to data.
  • Involved in gathering business requirements and attended technical review meetings to understand the data warehouse model.
  • Involved in data modeling and design of the data warehouse in star schema methodology with conformed granular dimensions and fact tables.
  • Developed Technical Specifications of the ETL process flow.
  • Extensively used ETL to load data from Flat files, DB2, and Oracle into Oracle.
  • Used Informatica Designer to extract and transform the data from various source systems by incorporating various business rules; also used different transformations, sessions, and command tasks.
  • Created mappings using different transformations like Aggregator, Expression, Stored Procedure, Filter, Joiner, Lookup, Router, and Update Strategy
  • Tuned performance of Informatica session for large data files by increasing block size, data cache size, sequence buffer length, and target-based commit interval.
  • Using Informatica Repository Manager, maintained all the repositories of various applications; created users, user groups, and security access controls.
  • Developed shell scripts for job automation that generate a log file for every job.
  • Created Stored Procedures for Audit and Error Balancing.
  • Used Crontab for scheduling.
  • Validated Data to Maintain Referential Integrity.
  • Developed UNIX shell scripts to move source files to an archive directory.
  • Involved in Unit, Integration, system, and performance testing levels.
  • Worked on back-end programs such as PL/SQL procedures, functions, and packages.
  • Prepared a production monitoring and support handbook for the ETL process.
  • Coordinated offshore and onsite teams totaling 7 ETL developers.
  • Involved in the design and development of Reporting System using Business Objects.
  • Worked on a data governance application for the Informatica MDM Hub that enables business users to effectively create, manage, consume, and monitor master data using IDD (Informatica Data Director).
  • Developed numerous mappings using various transformations, including Address Doctor, Association, Case Converter, Classifier, Comparison, Consolidation, Match, Merge, Parser, etc.
  • Have implemented SCD Type-1, Type-2, Truncate/Reload, and Incremental Loading mappings using Mapping Variables and Parameter Files.
  • Developed Unix Script for email notifications for notifying clients.
  • Preparing a Job flow diagram for executing the different jobs in the best possible way.
  • Migration of ETL code, Database components, and CA7 jobs to the higher environment using uDeploy, Bitbucket (git), Artifactory, etc.
  • Deployment of ETL code to various environments (Test, QA, and production) using DevOps tools - uDeploy, Bitbucket, etc.
  • Interacted with subject matter experts and business analysts to analyze reporting requirements and define business and functional specifications.
  • Monitored and documented change requests in requirements and ensured modified requirements were met during BI development.
  • Developed metadata repository and configured metadata objects in all three layers using the Oracle BI Administration tool.
  • Built a repository by importing the data, defining keys and joins, creating a business model, defining complex joins, mapping columns and sources, creating measures, and developing subject areas.
  • Developed various reports and interactive dashboards with drill-down capabilities, various charts and views, and tables using global and local filters.
  • Developed reports and dashboards with different analytics views, including pivot tables, charts, gauges, column selectors, and view selectors with global and local filters, using Oracle BI Presentation Services.
  • Used OBIEE Web Catalog to set up groups, access privileges, and query privileges.
  • Configured schedulers and used iBots to generate and deliver reports based on business requirements.
  • Created dimension hierarchies, aggregation criteria, and multiple logical sources in the BMM layer.
  • Involved in performance tuning of the dashboards with BAs and DBAs using cache management and aggregate tables.
  • Created schedules to seed and purge cache for cache management.
  • Implemented security by creating roles and web groups, and defined Object Level and Data Level Security.
  • Developed templates and reports in Oracle BI Publisher in RTF format.
  • Attended JAD and whiteboarding sessions with users and business analysts to keep the business synchronized with progress.
  • Created wireframes and web mockups to give end users the look and feel of the UI.
  • Created test plans and test cases and performed unit and integration testing.
  • Performance optimization of the existing code based on the requirement.
  • Performed defect analysis and resolution, bug fixing, and defect prevention.
  • Perform IQA on developed code and peer review.
  • Production readiness activities such as documentation, Object inventory preparation, and Turnover meetings wif production support teams before deployment in production.
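The SCD Type-2 mappings mentioned above (expire the current dimension row, then insert a new version) follow logic along these lines. This is a hedged Python sketch with invented keys and dates, not the actual Informatica implementation:

```python
# Sketch of SCD Type-2 logic: when tracked attributes change, close the
# current row and append a new current version. Names are illustrative.
from datetime import date

def scd2_apply(history, natural_key, new_attrs, load_date):
    """history: list of dicts with keys nk, attrs, eff_from, eff_to, current."""
    for row in history:
        if row["nk"] == natural_key and row["current"]:
            if row["attrs"] == new_attrs:
                return history              # no change: nothing to do
            row["eff_to"] = load_date       # expire the current version
            row["current"] = False
            break
    history.append({"nk": natural_key, "attrs": new_attrs,
                    "eff_from": load_date, "eff_to": None, "current": True})
    return history

hist = [{"nk": "C1", "attrs": {"city": "Cleveland"},
         "eff_from": date(2020, 1, 1), "eff_to": None, "current": True}]
hist = scd2_apply(hist, "C1", {"city": "Pittsburgh"}, date(2021, 6, 1))
# hist now holds the expired Cleveland row and a current Pittsburgh row
```

In Informatica this is typically realized with a Lookup on the dimension, an Update Strategy transformation, and mapping variables or parameter files carrying the load date; the sketch only shows the row-versioning rule itself.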

Environment: Oracle 11gR2 Client, Teradata 15.0, Informatica 9.5.1/9.6.1, Informatica Power Exchange 9.5.1/9.6.1, OBIEE 10.1.3.3, Informatica Power Center 8.1, Oracle 10g, OBIA 7.9 (Oracle Service Analytics), DAC, Toad, SQL, Tableau Desktop/Server 9.3, EAS & HPP 11.1.2.3, CA Software Change Manager, HP ALM, UNIX, SQL Server Management Studio, Vena System, WinSCP, Endeavor- CA7.

Confidential

ETL Developer

Responsibilities:

  • Generating daily PWR1 outbound files for rating calculations on the PEGA side.
  • Working with businesses to document business problems and probable solutions.
  • Streamlined the ETL Code and implemented the required standards.
  • Implemented synonyms at the database level to minimize downtime.
  • Carrying monthly loads in all the lower environments as per business needs.
  • Have created Power Exchange objects such as Data maps.
  • Responsible for implementing Re-usable Mapplets, transformations, Mappings, Emails, Command tasks, and Unix Scripts.
  • Experienced in creating Tableau Desktop reports with data blending and dual axes, publishing them to the server with the respective permissions, and adjusting report specifications in higher environments per business users.
  • Responsible for creating Logical Data Objects (LDOs) and profiling multiple sources in Informatica Developer and Analyst.
  • Responsible for using Data Integration Hub (DIH) in creating topics and applications to publish and subscribe to data.
  • Worked on a data governance application for the Informatica MDM Hub that enables business users to effectively create, manage, consume, and monitor master data using IDD (Informatica Data Director).
  • Developed numerous mappings using various transformations, including Address Doctor, Association, Case Converter, Classifier, Comparison, Consolidation, Match, Merge, Parser, etc.
  • Responsible for Performance Tuning at the Source level, Target level, Mapping Level, and Session Level by implementing Push Down Optimization, Session level Partition, and Ordering the data in Lookup/Join queries.
  • Have created CA7 Jobs to schedule the ETL jobs using Endeavor.
  • Reviewed scope and solution documents with the business team on an immediate basis.
  • Sought sign-offs and approvals on scope and solution documents; all documents must be signed by the IT and business management teams.
  • Kept the solution and approach transparent to the business and IT management teams; no solution should be implemented without approval from business and IT management.

Environment: Oracle 11gR2 Client, Teradata, Informatica 9.5.1/9.6.1, Informatica Power Exchange 9.5.1/9.6.1, Tableau Desktop/Server 9.3, EAS & HPP 11.1.2.3, CA Software Change Manager, HP ALM, UNIX, SQL Server Management Studio, Vena System, WinSCP, Endeavor- CA7.

Confidential

ETL Developer

Responsibilities:

  • Understanding Business requirements.
  • Developing the ETL components as well as Oracle procedures, functions & triggers.
  • OFSAAI Data Entry Forms and Queries, Excel upload, Metadata Restore/Archive Utility, and Data model creation and upload process.
  • Reviewing components and test results prepared by other team members.
  • Experienced in debugging mappings: identifying bugs in existing mappings by analyzing the data flow and evaluating transformations, with emphasis on mapping performance optimization.
  • Responsible for implementing Re-usable Mapplets, transformations, Mappings, Emails, Command tasks, and Unix Scripts.
  • Responsible for implementing SCD Type-1, Type-2, Truncate/Reload, and Incremental Loading mappings using Mapping Variables and Parameter Files.
  • Created Tableau Desktop reports with data blending and dual axes, published them to the server with the respective permissions, and adjusted report specifications in higher environments per users.
  • Responsible for Performance Tuning at the Source level, Target level, Mapping Level, and Session Level by implementing Push Down Optimization, Session level Partition, and Ordering the data in Lookup/Join queries.
  • Experienced in writing Advanced/complex SQL Queries, PL/SQL Sequences, Procedures, Functions, Triggers, and Unix/Linux Scripting.
  • Carrying out pre-production implementation reviews of deliverables.
  • Responsible for implementing the best ETL solution through mapping logic, testing, migrating the code to higher environments, creating CA7 jobs, and deploying the code into production.
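Incremental loads driven by mapping variables, as in the SCD mappings above, typically read their last-run values from an Informatica-style parameter file ($$VAR=value pairs under a [folder.WF:workflow] header). The following Python sketch parses such a file; the workflow name and parameter values are hypothetical:

```python
# Sketch of parsing an Informatica-style parameter file.
# The section header and $$ parameter names below are invented examples.

def parse_param_file(text):
    params = {}
    section = None
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue                        # skip blanks and comments
        if line.startswith("[") and line.endswith("]"):
            section = line[1:-1]            # e.g. FIN.WF:wf_daily_load
            params[section] = {}
        elif "=" in line and section:
            name, value = line.split("=", 1)
            params[section][name.strip()] = value.strip()
    return params

sample = """
[FIN.WF:wf_daily_load]
$$LAST_RUN_DATE=2021-06-01
$$SRC_SCHEMA=STG
"""
p = parse_param_file(sample)
# p["FIN.WF:wf_daily_load"]["$$LAST_RUN_DATE"] -> "2021-06-01"
```

At runtime the Integration Service resolves these $$ variables inside the session, so the ETL picks up only rows newer than $$LAST_RUN_DATE; the sketch just shows the file format.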

Environment: Oracle 11gR2 Client, Teradata, Informatica Power Center/Power Exchange 9.5.1, CA Software Change Manager, Endeavor, HP ALM, UNIX, SQL Server Management Studio, WinSCP, CA7.
