
Sr. Business Intelligence Engineer Analyst Resume

Costa Mesa, CA

PROFESSIONAL SUMMARY:

  • Technical Professional with eight plus years of Business Intelligence/ETL/Data Analytics experience in Credit, Finance, Healthcare, Corporate, and Retail domains
  • Strong foundation in Agile software development methodology in both Scrum and Kanban frameworks, enabling active participation in Agile ceremonies and continuous delivery of features and functions
  • Adaptive team player who carries through need-based tasks like requirement analysis, technical design proposals, validation management, deployment management, root cause analysis, operational support, product research and process improvements, and business communications
  • Extensive experience in data acquisition, data cleansing, data profiling, data transformation, data-warehouse management, data visualization, publication and distribution of analytics, and BI/ETL related upgrades and migrations
  • Eight plus years of experience working with OLTP and OLAP Systems/Databases like SQL Server, Oracle, Netezza, DynamoDB, Redshift, Sybase, JD Edwards, Primavera, and other legacy systems
  • Eight plus years of experience working with reporting applications like Tableau, Power BI, Cognos, OBIEE, Business Objects, Microstrategy, and MS Excel add-ins. This includes establishing new reporting environments, report development, user profiling, report/dashboard scheduling and distribution, reporting upgrades, and reporting migrations. Report development experience includes Financial Reports, Product Life-cycle Reports, Call Center Reports, Operational Reports, Human Resources Reports, Payroll Reports, PMO Reports, etc.
  • Seven plus years of experience developing ETL/ELT/ELTL processes using Informatica Powercenter, SSIS, Oracle Data Integrator, etc. This includes extracting data from relational databases/files, implementing incremental loads, implementing SCD rules, data normalization, data replication, data archival, data quality improvement, process enhancement, and workflow scheduling and monitoring
  • Extensive Data analysis and testing experience of Web and API data, Application data, RDBMS data, Warehouse data, and Reporting data using Excel, Power-BI, R, and SAS
  • Strong understanding of Scrum, Project Management, Six Sigma, and ITIL concepts coupled with immaculate documentation experience covering Functional Requirement Documents, Business Requirement Documents, Project Timeline Documents, Technical Design Documents, Knowledge Articles, Migration Documents, Project Status Documents, Responsibility Assignment Matrices, UAT Documentation, JIRA Documentation, and Confluence Documentation

TECHNICAL SKILLS:

Reporting Applications: Tableau 8.x/10.x, Power BI, Cognos 10.x/11.x, SSRS 2008/2012, OBIEE 10g/11g, BusinessObjects XIR2

Data Analysis Applications: R, SAS, SSAS, PowerPivot

ETL Applications: Informatica Powercenter 6.x/7.x/8.x/9.x, SSIS 2008/2012, Oracle Data Integrator 11g

Databases: MS SQL Server 2008/2012, Netezza, Redshift, DynamoDB, Oracle 8i/9i/10g/11g, Sybase, DB2

Languages: SQL, T-SQL, PL-SQL, R, DAX, Python, Ruby, C

Integration Applications: Git, Jenkins

Scheduling Applications: Control-M, Autosys, Oracle DAC

IDEs: SQL Server Management Studio, Aginity Workbench, SQL Developer, TOAD, Rapid SQL, Atom, RubyMine

Operating Systems: Windows 2000/XP/7/8, Redhat Enterprise Linux 5.x/6.x

Applications (Web/Desktop): Erwin, Putty, SecureFX, SecureCRT, WinSCP, Splunk, Saucelabs, SourceTree, MS Office, SoapUI, Postman, JIRA, Rally, Confluence, Sharepoint, Appdynamics, TIBCO monitoring, Cherwell, Serena, Remedy

PROFESSIONAL EXPERIENCE:

Confidential, Costa Mesa, CA

Sr. Business Intelligence Engineer Analyst

Responsibilities:

  • Write Business Intelligence user stories in Gherkin syntax to achieve 100% scope coverage of new incoming requirements; lay out the delivery schedule, UAT plan, deployment schedule, and review window to organize feature delivery and follow-ups
  • Review Wells Fargo’s reporting requirements and propose relevant enhancements to deliver top business value from Tableau based Reporting
  • Develop call center monitoring reports in Tableau Desktop and publish them to Tableau Server, building Operational and Value-Addition dashboards with multiple metric-based visualizations configured with a live connection to Avaya’s SQL Server based phone switch data
  • Develop Tableau Reports and Dashboards for Products, Compliance, Benefit Exceptions, Invoice, Operations, and PMO functions
  • Update Informatica Mappings/Sessions to implement enhancement features by updating source qualifier queries, expressions, lookup types, implementing indexes, commit intervals, transaction controls, introducing mapping/session variables, etc.
  • Debug web.config files, .NET application code, ETL processes, and stored procedures to analyze process behaviors like Bureau Enrollment, Alert Benefit Enrollment, Payment processing, Alert processing, Notification processing, contact recovery, and 3BQS and Score Monitoring benefits, and to perform Root Cause Analysis
  • Develop Production-Exception-Review scripts in SQL Server leveraging transaction control, common table expressions, temporary variables, temporary tables, indexes, analytical functions, etc. to remediate process exceptions on a monthly basis (an illustrative sketch follows this list)
  • Carry out data analysis of stratified marketing results for new products in R to regress county-wise population attributes against enrollment and conversion behaviors and provide recommendations to both product and marketing teams
  • Develop Power BI Reports and Dashboards by implementing data relationships, custom queries, measures, page-level filters, report-level filters, DAX, daily data refresh, content packs, user group profiling, etc. to provide KPIs like Conversion to Cost of Acquisition, customer engagement by marketing method, etc. by consuming data files and RDBMS data
  • Extract and analyze data from materialized views in Amazon Redshift for new products
  • Lead/participate in weekly deployment cycles and coordinate with Build Engineers, Application Engineers, Batch Control Engineers, and Reporting Engineers
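
A minimal sketch of the exception-review pattern referenced in the scripting bullet above: a common table expression with an analytical function flags the latest per-customer record that did not complete, run from Python via pyodbc. The table, column, server, and database names are illustrative assumptions, not the actual production schema; the real monthly scripts also use transaction control, temp tables, and indexes, omitted here for brevity.

```python
# Sketch only: latest-status exception check using a CTE and ROW_NUMBER(),
# executed against SQL Server through pyodbc. dbo.BenefitDelivery and its
# columns are hypothetical placeholders, not the real production schema.
import pyodbc

EXCEPTION_QUERY = """
WITH ranked AS (
    SELECT CustomerId,
           Status,
           DeliveredOn,
           ROW_NUMBER() OVER (PARTITION BY CustomerId
                              ORDER BY DeliveredOn DESC) AS rn
    FROM dbo.BenefitDelivery
    WHERE DeliveredOn >= DATEADD(MONTH, -1, GETDATE())
)
SELECT CustomerId, Status, DeliveredOn
FROM ranked
WHERE rn = 1 AND Status <> 'Delivered';
"""

def fetch_exceptions(conn_str: str):
    """Return the most recent delivery row per customer that did not complete."""
    with pyodbc.connect(conn_str) as conn:
        cursor = conn.cursor()
        cursor.execute(EXCEPTION_QUERY)
        return cursor.fetchall()

if __name__ == "__main__":
    rows = fetch_exceptions(
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=reporting-db;DATABASE=benefits;Trusted_Connection=yes;"
    )
    for row in rows:
        print(row.CustomerId, row.Status, row.DeliveredOn)
```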

Environment: SQL Server 2012, Netezza, DynamoDB, Informatica 9.1.0/9.5.1, Tableau 10.x, Power BI, Cognos, .NET Framework, Control-M, Splunk, Appdynamics, TIBCO monitoring, Putty, WinSCP, SSMS, Rapid SQL, Toad, Cherwell, Serena, Postman, SoapUI, Ruby, Python, JIRA, Rally, Saucelabs, Git, SourceTree, and Jenkins

Confidential, Irvine, CA

Sr. Applications Analyst Data-warehouse

Responsibilities:

  • Identify enhancement features and data-flow processes by collaborating with application owners, application developers, and third party vendors
  • Create Agile stories in Given-When-Then format and Use-case documents for enhancement features
  • Coordinate database designs and ETL layout with data-warehouse team to facilitate data delivery for enhancement features
  • Initiated and delivered KPI - Information Ratio to the account summary page for all active investment accounts in MARS application for accounts reporting
  • Initiated and delivered KPI - IR Tstat to the investments performance metrics for all historical investments in Investments Reports to gauge Portfolio Managers’ decisions
  • Deliver an Excel based conversion app to facilitate cash reconciliation between Portfolio Accounting and Manager Accounting applications
  • Perform problem investigation for job failures arising from server, network, data-file, or vendor issues and work towards their resolution
  • Implement compliance requirements for KYC, Risk Profiling, Foreign Assets, and Vendor Management/Client On-boarding
  • Coordinate with application owners to configure command, box, and file-watcher Autosys jobs and analyze their atomicity and dependency structure
  • Code-review related SQL server developments, Informatica workflows, and Autosys jobs post development and trigger dry-runs of Database/ETL/Autosys development in lower environments to identify data/performance bottlenecks
  • Write SQL queries to extract relevant data and perform analysis and design virtual reports from the analyzed results
  • Perform monthly incident analysis in Remedy to identify and investigate frequently occurring Autosys job-failures and recommend remediation
  • Issue Forced Start, Forced Success, On Hold, Off Hold, On Ice, and Off Ice SENDEVENT commands for on-demand and production Autosys jobs (an illustrative sketch follows this list)
  • Configure de-listed and newly-launched equity scripts/bonds/ETFs, support data-mods on data/parameter files to remediate data issues, manually SFTP data files in case of FTP failures, BCP data to database tables for restoration/backup/data analysis, troubleshoot batch job failures using Application/Server/Informatica/Splunk logs, and support data warehouse maintenance and outage communications
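
A hedged sketch of the Autosys event handling mentioned above, wrapping the sendevent CLI from Python. The job names are hypothetical placeholders, and the Autosys client is assumed to be installed and already authenticated on the host.

```python
# Sketch: issue Autosys SENDEVENT commands (force start, change status,
# on/off ice) by shelling out to the sendevent CLI. Job names below are
# hypothetical examples, not actual production jobs.
import subprocess
from typing import Optional

def send_event(event: str, job: str, status: Optional[str] = None) -> None:
    """Run sendevent -E <event> -J <job> [-s <status>] and fail loudly on error."""
    cmd = ["sendevent", "-E", event, "-J", job]
    if status:                       # CHANGE_STATUS needs an explicit status
        cmd += ["-s", status]
    subprocess.run(cmd, check=True)

# Force-start an on-demand box, mark a stuck file-watcher as SUCCESS,
# and put a decommissioned feed on ice.
send_event("FORCE_STARTJOB", "NIGHTLY_LOAD_BOX")
send_event("CHANGE_STATUS", "PRICES_FILEWATCHER", status="SUCCESS")
send_event("JOB_ON_ICE", "RETIRED_FEED_JOB")
```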

Environment: SQL Server 2012, Oracle 11g, Sybase, Informatica 9.1.0/9.5.1, Microstrategy, .NET Framework, Autosys, Splunk, Appdynamics, TIBCO monitoring, SecureCRT, SecureFX, SSMS, Rapid SQL, Toad, Remedy

Confidential, Costa Mesa, CA

Sr. Business Intelligence Developer

Responsibilities:

  • Review business requirements and propose technical design and approach to achieve code integrity and delivery of business value
  • Develop new trending reports/data extracts/exception reports/SLA reports/interactive dashboards, introduce new visualizations, and enhance existing KPIs in Cognos 10 using Administration, Connection, Framework Manager, Report Studio, Query Studio, Analysis Studio, Event Studio, etc., and update and build Cognos Cubes
  • Develop Informatica Powercenter mappings and workflows for new requirements like Regulation-E, SLA Tracking, etc.
  • Update 100+ Informatica mappings and workflows to implement the SQL Server to Netezza migration
  • Data validation of 250+ dimension, fact, and aggregate tables between the SQL Server and Netezza warehouses to identify development gaps in the Netezza migration projects
  • Perform Root Cause Analysis of process anomalies by analyzing transactional and warehouse data
  • Create Production Exception Review scripts to remediate benefit delivery exceptions like 3B-Quarterly Update, Monthly Score Monitoring, Correspondence, Refunds, etc.
  • Create replication stored procedures to continually replicate data from transactional tables to both staging and data-warehouse tables
  • Extract Customer Level Resolution (CLR) data for customer Journey tracking at different granularity for business analysis
  • Develop data extract SQL scripts that run against SQL Server or Netezza to provide data for ad-hoc business requests (an illustrative sketch follows this list)
  • Participated in Monthly Application and database Server Patching activities for Cognos/Informatica and Netezza
  • Designed test plans based on the project timeline, developed test cases based on use-case requirements, created test data to cover organic and edge-case scenarios, and drove UAT sessions for all Data-warehouse/ETL/BI developments and enhancements
  • Participated as a BI resource in multiple projects for affinity partners like USAA, AAA, Morgan Stanley, Discover, and American Express
  • Perform data conversion, exception handling, data cleaning, and data matching using Informatica Data Quality (IDQ); also validate correspondence data using IDQ to eliminate correspondence failures
  • Document Runbooks for ETL/Reporting jobs scheduled from Control-M and create technical mapping documents and test evidence documents
  • Participated in Grooming, planning, stand-up, retro sessions for Agile practice
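
A small sketch of the kind of ad-hoc extract script referenced above, able to run the same query against either SQL Server or Netezza over ODBC. The DSN names, credentials, query, and table are illustrative assumptions rather than actual warehouse objects.

```python
# Sketch: run one parameterized extract against either warehouse and spool
# the result to CSV. DSN names, credentials, and dw.fact_payments are
# placeholders for illustration only.
import csv
import pyodbc

DSNS = {
    "sqlserver": "DSN=EDW_SQLSERVER;Trusted_Connection=yes;",
    "netezza": "DSN=EDW_NETEZZA;UID=report_user;PWD=example;",
}

EXTRACT_SQL = (
    "SELECT account_id, event_dt, amount "
    "FROM dw.fact_payments WHERE event_dt >= ?"
)

def run_extract(target: str, since: str, out_path: str) -> int:
    """Execute the extract on the chosen warehouse and write it to CSV."""
    with pyodbc.connect(DSNS[target]) as conn, open(out_path, "w", newline="") as fh:
        cursor = conn.cursor()
        cursor.execute(EXTRACT_SQL, since)
        writer = csv.writer(fh)
        writer.writerow([col[0] for col in cursor.description])
        rows = cursor.fetchall()
        writer.writerows(rows)
    return len(rows)

print(run_extract("netezza", "2017-01-01", "payments_extract.csv"))
```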

Environment: SQL Server 2008, Informatica Powercenter 9.5.1, IDQ, Linux, Netezza, Cognos 10.4/11, Control-M V8x, SSIS, Aginity Workbench, AWS, Git, Jenkins, Cucumber, Rally, SSMS, Email Genie, Putty, WinSCP, Serena TeamTrack, Quality Center, SourceTree, SauceLabs, QTP

Confidential, Long Beach, CA

Sr. ETL Developer

Responsibilities:

  • Requirement gathering for implementation of Accounts Payable, General Ledger, and Budget reports in OBIEE
  • Designed project timeline, reviewed project progress and delivered projects involving efforts worth 60 days or less
  • Reviewed ETL logic for OBIA HR-Analytics history tracking cube
  • Developed mapplets, mappings, tasks and workflows involving complex dependency structures in Informatica 8.6.1 and 9.1.0
  • Tuned stored procedures for performance improvement of SQL scripts
  • Implemented the full upgrade life cycle for the OBIEE Financial Analytics module, including configuring source and lookup files, configuring DAC/runtime/mapping/session parameters, customizing ETL logic, designing the execution plan, report migration, unit testing, and deployment
  • Reviewed ETL logic for FT, AA, PM, and BA reports of Financial Analytics module of OBIA
  • Re-structured security profiles to establish access levels for Users
  • Coordinated with Oracle to resolve performance issues/failures for out-of-box reports and worked on performance improvement of custom reports in OBIEE 11g
  • Developed ETL and Business layer logic for New Hires, Promotions, headcount, annual rewards, compensations, history tracking and termination reports of Human Resource Analytics of OBIA
  • Deployed Resource Allocation, Resource Spread, Activity Spread, Resource Utilization, Time tracking, Actuals, Budget and Earned Value reports of P6 Analytics of OBIA
  • Performed mapping level data validation using Informatica Powercenter Data Validation Option (DVO), created Audit trails to maintain and track validation success
  • Converted Informatica 9.1 ETL to ODI ETL for OTBI implementation of HCM module of ERP systems
  • Structured Execution plan in Data-warehouse Administration console for efficient execution of nightly jobs
  • Develop ODI based ETL designs for HCM module of HR Analytics
  • Validate employee hierarchies in Informatica MDM
  • Validate data between the Data Warehouse and different application source systems like JD Edwards, Enterprise Project Portfolio Management, Kenexa, Kronos, and Coupa (an illustrative sketch follows this list)
  • Monitor and resolve production support issues in a multi priority and fast-paced environment
  • Designed test cases to validate new implementations and owned validation history management
  • Developed ICTC, RACI, BRD, FRD, Process flow, Technical Specifications, Data Lineage, STTM and CMD documents
  • Developed weekly ticket report for the Asst. Vice President of Corporate Systems department
  • Designed, updated and maintained Business Intelligence Team’s Sharepoint site
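
A hedged sketch of the source-versus-warehouse validation referenced above: row counts and a simple amount checksum are compared per table over ODBC. The connection strings, table list, and amount column are assumptions made for illustration, not the actual JD Edwards or warehouse objects.

```python
# Sketch: reconcile a source system against the warehouse by comparing
# row counts and a SUM(amount) checksum per table. Table names and the
# amount column are hypothetical placeholders.
import pyodbc

TABLES = ["gl_transactions", "resource_assignments"]

def profile(conn_str: str, table: str):
    """Return (row_count, amount_checksum) for one table."""
    with pyodbc.connect(conn_str) as conn:
        row = conn.cursor().execute(
            f"SELECT COUNT(*), COALESCE(SUM(amount), 0) FROM {table}"
        ).fetchone()
        return int(row[0]), float(row[1])

def reconcile(source_cs: str, warehouse_cs: str) -> None:
    """Print an OK/MISMATCH line per table for a quick daily sanity check."""
    for table in TABLES:
        src = profile(source_cs, table)
        dwh = profile(warehouse_cs, table)
        status = "OK" if src == dwh else "MISMATCH"
        print(f"{table}: source={src} warehouse={dwh} -> {status}")

reconcile("DSN=SOURCE_ERP;UID=ro_user;PWD=example;",
          "DSN=EDW;UID=ro_user;PWD=example;")
```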

Environment: Informatica Powercenter 8.6.1/9.1.0, ODI, MDM, IDQ, Salesforce, OBIEE 10g/11g, Oracle 10g/11g, MS SQL Server 2008, DAC 10g/11g, Crontab, Windows, Linux, and SQL Server Management Studio (SSMS)

Confidential, Stamford, CT

Sr. ETL Developer

Responsibilities:

  • Responsible for support and maintenance for the ETL (Extract, Transform and Load) processes using Informatica Power Center
  • Imported data from various sources such as Excel and flat files into targets, and developed transformations using Informatica Power Center Designer for testing
  • Performed data modeling to enable reporting segregation
  • Validated various mappings involving sales strategies, covering all sources, targets, and transformations
  • Worked with Informatica Data Quality (IDQ) to obtain desired and efficient results
  • Create and maintain metadata and ETL documentation that support business roles and detailed source to target data mappings
  • Perform data validation for the Oracle to Teradata migration of the Sales database
  • Developed and maintained technical documentation namely database design document regarding the extract, transformation and load process
  • Designed and developed complex Aggregate, Join, Router, Look up, XML and Update Strategy transformation rules (business rules).
  • Implemented mapping-level optimization along the best route possible without compromising business requirements
  • Worked on implementing BDE for Informatica 9.1.0
  • Wrote PL/SQL packages, procedures, and functions in Oracle for Inventory business-rule conformity
  • Design use cases for Informatica to Ab Initio ETL logic migration
  • Utilized SQL*Loader and export/import utilities for data load and transformation in the Oracle-based ‘Latin-AM-operations datamart’
  • Developed PL/SQL stored procedures for source pre-load and target pre-load to verify the existence of tables.
  • Functional and Business Process testing using HP UFT and HP BPT respectively
  • Worked with different sources such as Oracle databases and flat files and used Informatica to extract data
  • Involved in Unit Testing, Integration, and User Acceptance Testing of Mappings
  • Performed data profiling to assess the risk involved in integrating data for new applications, including the challenges of joins, and to track data quality standards (an illustrative sketch follows this list)
  • Used Informatica Web Services to extract severity-wise pending request reports for the control center
  • Worked extensively with the Business Intelligence team in preparing and delivering a variety of interactive and printed reports using Crystal Reports and Business Objects
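
A minimal profiling sketch in the spirit of the data-profiling bullet above: pandas is used to check null and duplicate rates on a candidate join key and the join coverage between two extracts. The file names and key column are illustrative assumptions.

```python
# Sketch: profile a candidate join key before integrating two extracts.
# customers_extract.csv, invoices_extract.csv, and customer_id are
# hypothetical placeholders.
import pandas as pd

customers = pd.read_csv("customers_extract.csv")
invoices = pd.read_csv("invoices_extract.csv")
key = "customer_id"

# Null and duplicate rates on the join key in each extract.
for name, df in [("customers", customers), ("invoices", invoices)]:
    null_pct = df[key].isna().mean() * 100
    dup_pct = df[key].duplicated().mean() * 100
    print(f"{name}: {null_pct:.1f}% null keys, {dup_pct:.1f}% duplicate keys")

# Share of invoice keys that find a match in the customer extract.
coverage = invoices[key].isin(customers[key]).mean() * 100
print(f"join coverage: {coverage:.1f}% of invoices match a customer")
```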

Environment: Informatica Power Center 9.1/8.6, IDQ, Oracle 10g/11g, Teradata, UNIX, Shell Scripts, TOAD 10.6, Ab Initio, Control-M 6.4, ERwin 7.3, HP Quality Center 10.0, and SQL Server Management Studio (SSMS)

Confidential, Atlanta, GA

Sr. Informatica Developer

Responsibilities:

  • Gathered business requirements by conducting meetings with business analysts, stakeholders, development teams, and data analysts on a scheduled basis for the data integration process
  • Identified and documented data sources and transformation logic required to populate data and maintain targets
  • Extract and load source data to severity datamart using Informatica
  • Implemented transformation logic to support ETL processes to load data into staging database using Informatica power center
  • Played key role in determining data feed extraction process, data analysis, and testing and project coordination
  • Applied transformation logic such as XML transformations, lookup transformations, update strategy transformations, expression transformations, joiner transformations, etc. to extract, transform, and load data into targets
  • Extensive development with lookup caches like Persistent Cache, Static Cache, and Dynamic Cache to improve the performance of lookup transformations
  • Designed re-useable transformations to quickly add new data sources and transformations related to business needs
  • Worked with SQL Stored Procedures and converted the logic to put in Informatica to integrate the data from different databases
  • Developed documentation for all mappings/workflows and developed run books that proved extensively helpful to the Production Support team in recovery situations
  • Identified performance bottlenecks in sources, mappings, and workflow sessions, and used best practices for performance tuning of bottlenecks
  • Utilized Autosys scheduler to run sessions and workflows in batch processes
  • Coordinated in writing test plans to perform various test cases for the validity of data

Environment: Informatica Power Center 8.6.1/9.x, Oracle 10g, SQL Server 2005 & 2008, PL/SQL, SQL*Plus, Windows NT, Unix Shell Scripting, Erwin 4.

Confidential

Systems Engineer (DB Professional)

Responsibilities:

  • Worked with business users to define and analyze problems and business needs by participating in sessions with the analysts
  • Established data standards for customer information, including data definitions, component structures (such as for complex data types), code values, and data use
  • Created a bridge table for the numerous dimension groups and related them accordingly
  • Developed an enterprise-wide framework that defines how multiple sources of data should be consolidated into a single structure, enabling the use of creative business analytics to extract useful information from large data sources
  • Applied reusable tasks and worklets in Informatica that helped simplify the design of workflows implementing the transformations that move data from diversified sources to the target
  • Tuned the performance of Informatica sessions by increasing block size, data cache size, sequence buffer length, and target-based commit interval, and tuned mappings by dropping and recreating indexes
  • Performed Unit testing during the mapping phase to ensure proper and efficient implementation of the transformations
  • Used mapping parameters to extract the required data from the sources and direct the data to the targets
  • Wrote UNIX shell scripts and used the pmcmd command-line utility to interact with the Informatica Server from command mode (an illustrative sketch follows this list)
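
A hedged sketch of driving Informatica from the command line as mentioned above, wrapping the pmcmd startworkflow call from Python. The integration service, domain, folder, workflow, and credentials are placeholder assumptions, not actual configuration.

```python
# Sketch: start an Informatica workflow via the pmcmd CLI and wait for it
# to finish. Service, domain, folder, workflow, and credentials below are
# hypothetical placeholders; real jobs would read credentials securely.
import subprocess

def start_workflow(folder: str, workflow: str) -> None:
    """Invoke pmcmd startworkflow and block until the workflow completes."""
    subprocess.run(
        [
            "pmcmd", "startworkflow",
            "-sv", "INT_SVC_DEV",      # integration service name (assumed)
            "-d", "Domain_Dev",        # Informatica domain (assumed)
            "-u", "etl_user", "-p", "********",
            "-f", folder,
            "-wait",
            workflow,
        ],
        check=True,
    )

start_workflow("SALES_DW", "wf_load_daily_sales")
```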

Environment: Informatica Power Center 7.1.2, Oracle 9i, AIX 5.3, Toad 8.5, ERwin 4.1, Windows XP.
