Sr. Business Data Analyst Resume

Plano, TX

SUMMARY:

  • Over 7 years of IT experience in the field of Data/Business Analysis, ETL Development, Master Data Management, Data Modeling, and Project Management.
  • Strong experience in Business and Data Analysis, Data Profiling, Data Migration, Data Conversion, Data Quality, Data Governance, Data Integration and Metadata Management Services and Configuration Management.
  • 2+ years of experience in Data Modeling with expertise in creating Star and Snowflake schemas, Fact and Dimension tables, and Physical and Logical Data Modeling using Erwin and Embarcadero.
  • Ability to collaborate with peers in both business and technical areas to deliver optimal business process solutions in line with corporate priorities.
  • Knowledge in Business Intelligence tools like Business Objects, MSBI, Cognos and OBIEE.
  • Experience with Teradata as the target for the data marts, worked with BTEQ, Fast Load and MultiLoad.
  • Strong experience in interacting with stakeholders/customers, gathering requirements through interviews, workshops, and existing system documentation or procedures, defining business processes, identifying and analyzing risks using appropriate templates and analysis tools.
  • Experience in various phases of Software Development life cycle (Analysis, Requirements gathering, Designing) with expertise in documenting various requirement specifications, functional specifications, Test Plans, Source to Target mappings, SQL Joins.
  • Experience in conducting Joint Application Development (JAD) sessions for requirements gathering, analysis, design and Rapid Application Development (RAD) sessions to converge early toward a design acceptable to the customer and feasible for the developers and to limit a project’s exposure to the forces of change.
  • Gathered and documented MDM application, conversion and integration requirements.
  • Worked in Agile Methodologies for the implementation of the project.
  • Experience in coding SQL/PL SQL using Procedures, Triggers and Packages.
  • Good understanding of Relational Database Design, Data Warehouse/OLAP concepts and methodologies.
  • Worked on different platforms such as Windows 95/98/NT and UNIX variants (Sun Solaris, AIX, HP-UX)
  • Experience in Working with different industries like Financial, Media, Retail and Banking.
  • Experienced in CCAR Regulatory Reporting and CCAR Stress Testing for Banking Balance Sheet Asset management
  • Interfaces with members, peers, management, and others to provide excellent internal/external customer service in specialized areas of regulatory responsibility
  • Experience with MDM Hub configurations - Data modeling & Data Mappings, Data validation, cleansing, Match and Merge rules.
  • Created Functional Decomposition diagram to represent the data management activities.
  • Insight into the design and development of Tableau visualizations, including preparing dashboards using calculations, parameters, calculated fields, groups, sets, and hierarchies.
  • Good knowledge of developing, designing, and supporting interactive Tableau dashboard reports
  • Experience in creating functional/technical specifications, data design documents based on the requirements.
  • Responsible for quality control activities including test case design and management, test execution suite design, test execution, and defect cycle management
  • Created a Text Analysis Library in Rtool and Python with multiple generic as well as customized applications (see the sketch after this list)
  • Good understanding of the object data model of SalesCloud-related objects such as Opportunities, Quotes, QuoteLineItems, Products, and Pricebooks.
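As an illustration of the text analysis work mentioned above, the following is a minimal Python sketch of the kind of generic utility such a library might contain; the function names and sample comments are hypothetical, not the actual library code.

```python
import re
from collections import Counter

def tokenize(text):
    """Lower-case raw text and split it into word tokens."""
    return re.findall(r"[a-z']+", text.lower())

def term_frequencies(documents, stop_words=None):
    """Count token frequencies across a list of documents, skipping stop words."""
    stop_words = set(stop_words or [])
    counts = Counter()
    for doc in documents:
        counts.update(t for t in tokenize(doc) if t not in stop_words)
    return counts

# Hypothetical usage on a couple of free-text comments.
comments = ["Loan approval was fast", "Approval process was slow for the loan"]
print(term_frequencies(comments, stop_words=["was", "the", "for"]).most_common(3))
```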

TECHNICAL SKILLS:

Languages: PostgreSQL, T-SQL, PL/SQL, R, Python

Data Visualization: Tableau

Data Analysis: SAS, Rtool

Modeling Techniques: Predictive Modeling, ANOVA, Linear Regression, Logistic Regression, Cluster Analysis

Web Technologies: HTML, DHTML, JAVA Script, XML

Databases: MS SQL Server (2012/2008/2005/2000/7.0), MS Access, Oracle 10g

Reporting: SQL Server Reporting Services (SSRS), Tableau, CCAR

Process/Modeling Tools: MS Office, MS Project, Visio, SharePoint 2013, Rational Rose, RequisitePro, ClearCase, ClearQuest, MS Excel, MS PowerPoint, MS Word Professional

PROFESSIONAL EXPERIENCE:

Confidential

Sr. Business Data Analyst

Responsibilities:

  • Collaborated with business process owners to define requirements and to design and present solutions that meet those requirements accurately. Provided strategic recommendations based on analysis of the data, the business situation, and moderate knowledge of the systems.
  • Conducted cost-benefit analysis and identified profitable advertising mediums to target customers. Used clusters to build predictive models relying on Random Forest, logistic regression, and classification trees to forecast lead conversion and identify factors that have a direct impact on the Bank’s marketing efforts (see the sketch after this list).
  • Extract, transform, load, process, evaluate and maintain integrity of client-provided data.
  • Analyzed large data sets to provide strategic direction to the Bank. Worked with various segments of banking (Loans, Mortgages, Debit Cards, Prepaid Cards, Credit Cards, Deposits, Insurance, Plans, Marketing, etc.) to generate a new detailed General Ledger file for each source system. Also worked on determining household expenses, product pricing, credit discrepancies, and risk exposure for loans/mortgages, and prepared a detailed risk analysis report and impact analysis documentation.
  • Created SQL jobs to auto generate Daily Schedule Reports, Monthly Run and Quarterly Run of Reports according to the requirements defined. Involved in writing complex queries, building packages and reports to support different applications.
  • Creation of Requirements and Functional Design Documents detailing how the data related to assigned projects should be transformed for the Data Warehouse, stored within the Data Warehouse, and transformed for the downstream applications.
  • Create and/or maintain high-level reporting systems using SQL or other ETL tools. Troubleshoot and support various MS Access/SQL databases developed and utilized by the team to ensure optimal performance and data reliability at all times.
  • Documented various data quality mapping documents and adherence to audit and security compliance requirements.
  • Played a key role in conducting user acceptance testing with Line of Business, Stakeholders and downstream users by acting as a middle layer between the Technical team and the Business users.
  • Perform data analysis including, but not limited to, the use of SQL, DataStage, Mainframes, BlueZone Secure FTP, and BO Universe.
  • Perform some aspects of project management including but not limited to approval gathering, readiness assessments, and other artifacts and controls.
  • Participate in Business Working Team reviews with project stakeholders/LOB, presenting and discussing ideas for system improvements.
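A minimal sketch of the lead-conversion modeling referenced above, assuming a hypothetical marketing extract with illustrative column names; the actual feature set, data, and model tuning are not reproduced here.

```python
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score

# Hypothetical marketing extract; file and column names are illustrative only.
leads = pd.read_csv("marketing_leads.csv")
features = pd.get_dummies(leads[["channel", "segment", "contact_count", "balance"]])
target = leads["converted"]

X_train, X_test, y_train, y_test = train_test_split(
    features, target, test_size=0.3, random_state=42)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)

# Rank the factors that drive lead conversion and check discrimination on held-out data.
importances = pd.Series(model.feature_importances_, index=features.columns)
print(importances.sort_values(ascending=False).head(10))
print("AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))
```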

Environment: Cardett AQT, IBM InfoSphere DataStage, IBM DB2DBCOPY1, BlueZone Mainframe, Business Objects XI 3.1, Blue Zone Secure FTP, PUTTY, Oracle Data Mining, SAP Business Intelligence, ERWIN r9.7, Microsoft Suite, Collibra, InfoSphere Information Governance Catalog, SharePoint, Application Lifecycle Management (ALM) Quality Center, BMC Remedy IT Service Management, MDM Analytics.

Confidential, Plano, TX

Sr. Data Analyst

Responsibilities:

  • Work closely with MDM Informatica Architects to fully understand and “read” ETL processes and process flows for business requirements; understand the business rules that are coded within the ETL load processes.
  • Involved in defining the source to target data mappings, business rules, and business and data definitions.
  • Responsible for defining the functional requirement documents for each source to target interface.
  • Document, clarify, and communicate requests for change requests with the requestor and coordinate with the development and testing team.
  • Coordinated meetings with vendors to define requirements and system interaction agreement documentation between client and vendor system.
  • Remain knowledgeable in all areas of business operations in order to identify systems needs and requirements.
  • Worked with internal architects, assisting in the development of current and target state data architectures
  • Worked with project team representatives to ensure that logical and physical ER/Studio data models were developed in line with corporate standards and guidelines
  • Generate weekly and monthly asset inventory reports.
  • Document data quality and traceability documents for each source interface
  • Consolidating data from multiple sources in support of campus-wide decision making and related information needs such as reporting, analysis, and planning (EDW).
  • Configured MDM tables in the following sequence: Base Objects and Relationship Base Objects; defined and developed the ETL process to load the data into the landing tables during the land process
  • Created SQL Server Analysis Services projects with multidimensional databases, OLAP cubes, and dimensions for the analysis of sales in specific areas using SSAS.
  • Experience with Vertica as the target for the data marts (tables).
  • Worked with PostgreSQL to create sample data and created process flow diagrams using Visio (see the sketch after this list).
  • Document various data quality mapping documents and adherence to audit and security compliance requirements.
  • Work with the business and the ETL developers in the analysis and resolution of data related problem tickets.
  • Experience in MicroStrategy for pulling reports during mapping analysis; also assisted in building reports for inventory and sales/polling.
  • Deployed analysis projects using SSAS based on data volume, and developed KPIs that gave business users an easy understanding of the data.
  • Participate in developing data governance standards for data and data management activities for all respective groups.
  • Implemented complex discounting structures using the SalesCloud module in SalesForce.com.
  • Configured product bundle pricing with tier support discounts for various customers such as distributors and resellers.
  • Experience in designing and developing Data Warehouse applications using ETL and Business Intelligence tools like Informatica PowerCenter (9.x/8.1/7.x/6.x/5.x)/PowerMart 4.7, DataStage, Mainframe SAS, SSAS, SSIS, OLTP, OLAP, and Business Objects.
  • Experience in coding SQL/PL SQL using Procedures, Triggers and Packages.
  • Worked on Advanced Excel for Data Mapping and Data Analysis Reports.
  • Extensive experience in HP ALM administration module and performed QA duties.
  • Involved in data quality - testing the code generated by the developers; executed/evaluated manual or automated test cases and reported test results.
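A minimal sketch of creating sample data in PostgreSQL, as mentioned above; the connection details, table name, and columns below are hypothetical placeholders rather than the actual environment.

```python
import random
import psycopg2

# Hypothetical connection details and table name.
conn = psycopg2.connect(host="localhost", dbname="staging", user="analyst", password="secret")
cur = conn.cursor()

cur.execute("""
    CREATE TABLE IF NOT EXISTS sample_assets (
        asset_id   SERIAL PRIMARY KEY,
        asset_type VARCHAR(20),
        value_usd  NUMERIC(12,2)
    )
""")

# Generate a small batch of randomized sample rows for mapping/validation tests.
rows = [(random.choice(["server", "router", "laptop"]), round(random.uniform(500, 5000), 2))
        for _ in range(100)]
cur.executemany("INSERT INTO sample_assets (asset_type, value_usd) VALUES (%s, %s)", rows)

conn.commit()
cur.close()
conn.close()
```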

Environment: Erwin r9, PostgreSQL, Server 2008, UNIX AIX version 6, HP Vertica Analytics Platform 6.1.x, MicroStrategy desktop V 9.3.1, OBIEE, MS Excel, MS Visio 2013, TOAD, SQL, PL/SQL, Rally, QuickBase, Talend, Windows, full copy sandbox, Salesforce Schema Builder, Data Loader, Salesforce Workbench.

Confidential, LA

Sr. Data Analyst

Responsibilities:

  • Involved in defining the business/transformation rules applied for sales and service data.
  • Define the list codes and code conversions between the source systems and the data mart.
  • Worked with internal architects, assisting in the development of current and target state data architectures
  • Worked with project team representatives to ensure that logical and physical ER/Studio data models were developed in line with corporate standards and guidelines
  • Involved in defining the source to target data mappings, business rules, business and data definitions
  • Responsible for defining the key identifiers for each mapping/interface
  • Responsible for defining the functional requirement documents for each source to target interface.
  • Document, clarify, and communicate requests for change requests with the requestor and coordinate with the development and testing team.
  • Coordinated meetings with vendors to define requirements and system interaction agreement documentation between client and vendor systems; updated the Enterprise Metadata Library with any changes or updates
  • Document data quality and traceability documents for each source interface
  • Establish standards and procedures.
  • Experience with Teradata as the target for the data marts, worked with BTEQ, Fast Load and MultiLoad.
  • Evaluated data profiling, cleansing, integration and extraction tools (e.g. Informatica)
  • Coordinate with the business users to provide an appropriate, effective, and efficient way to design new reporting needs based on user requirements and existing functionality
  • Remain knowledgeable in all areas of business operations in order to identify systems needs and requirements.
  • Consolidating data from multiple sources in support of campus-wide decision making and related information needs such as reporting, analysis, and planning (EDW).
  • Produced PL/SQL statements and stored procedures in DB2 for extracting as well as writing data
  • Developed PL/SQL stored procedures for the end-user report requirements
  • Worked with PostgreSQL for creating sample data and created process flow diagrams using Visio.
  • Implementation of Metadata Repository, Maintaining Data Quality, Data Cleanup procedures, Transformations, Data Standards, Data Governance program, Scripts, Stored Procedures, triggers and execution of test plans
  • Created a Text Analysis Library in Rtool and Python with multiple generic as well as customized applications
  • The data was cleaned and run through Rtool to analyze and predict the change in batch inflow for each customer and its effect on the environment (a sketch of the approach follows this list).
  • Performed data quality analysis and validation to ensure accurate information; conducted data mining to identify trends and solutions to improve the quality of clinical data. Collected, maintained, manipulated, queried, researched, and scrubbed data to ensure accuracy and integrity.
  • Configured product bundle pricing with tier support discounts for various customers such as distributors, resellers, and end users using CPQ in SalesCloud.
  • Performed load and stress testing to benchmark the performance of the service. Technologies and technical tools used: Rtool, Rserve, Java, Python, Docker containers
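The batch-inflow analysis above was performed in Rtool; the following is a rough Python equivalent offered only as a sketch, with hypothetical file and column names.

```python
import pandas as pd
from sklearn.linear_model import LinearRegression

# Hypothetical extract of per-customer batch volumes; names are illustrative.
batches = pd.read_csv("batch_inflow.csv", parse_dates=["batch_date"])

# Basic cleaning: drop incomplete rows and obvious duplicates.
batches = batches.dropna(subset=["customer_id", "batch_count"]).drop_duplicates()

# Simple per-customer trend model: predict batch_count from an elapsed-day index.
for customer_id, history in batches.groupby("customer_id"):
    history = history.sort_values("batch_date")
    days = (history["batch_date"] - history["batch_date"].min()).dt.days.to_frame("day")
    model = LinearRegression().fit(days, history["batch_count"])
    next_day = [[days["day"].max() + 30]]
    print(customer_id, "projected inflow in 30 days:", model.predict(next_day)[0])
```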

Environment: SQL Server, PL/SQL, PostgreSQL, Oracle 9i, MS-Office, Teradata, Informatica, ER Studio, XML, Business Objects, Rtool, Python, UNIX, full copy sandbox, Salesforce Schema Builder, Data Loader, Salesforce Workbench.

Confidential, LA

Sr. Data Analyst

Responsibilities:

  • Provided Treasury and Cash Management support to VP of Finance, Assistant Treasurer and Accounts Payable Department
  • Work with the Project Management in the creation of project estimates.
  • Provide regulatory compliance risk expertise and consulting for projects to identify and mitigate regulatory compliance risk
  • Analyzed the data, identifying the source of data and data mappings for HCFG.
  • Worked extensively in documenting the Source to Target Mapping documents with data transformation logic.
  • Interact with SMEs to analyze the data extracts from Legacy Systems (Mainframes and COBOL files) and determine the element source, format, and its integrity within the system
  • Transformation of requirements into data structures which can be used to efficiently store, manipulate and retrieve information
  • Collaborate with data modelers and ETL developers in creating the Data Functional Design documents.
  • Ensure that models conform to established best practices (including normalization rules) and accommodate change in a cost-effective and timely manner.
  • Enforce standards to ensure that the data elements and attributes are properly named.
  • Work with the business and the ETL developers in the analysis and resolution of data related problem tickets.
  • Support development teams creating applications against supported databases.
  • Perform small enhancements (SOR element additions, data cleansing/data quality).
  • Create various Data Mapping Repository documents as part of Metadata services (EMR).
  • Provide inputs to development team in performing extraction, transformation and load for data marts and data warehouses.
  • Developed a large number of standard and customized reports using SSRS for data feeds and for reports on the server.
  • Involved in Creating and Testing the SSIS packages developed for security.
  • Modified PL/SQL scripts to validate and load data into interface tables. The backend was in Oracle and database operations were handled using stored procedures
  • Developed PL/SQL stored procedures for the end-user report requirements
  • Actively involved in developing Complex SSRS Reports involving Sub Reports, Matrix/Tabular Reports, Charts and Graphs
  • Produced PL/SQL statements and stored procedures in DB2 for extracting as well as writing data
  • Consolidating data from multiple sources for EDW.
  • Using SharePoint, created a new site and inserted an Excel Dashboard for SharePoint users
  • Added and configured SharePoint lists, sites, and created custom forms
  • Configured SharePoint Site Administration Settings.
  • Document various data quality mapping documents and adherence to audit and security compliance requirements.
  • Strong background in CCAR Regulatory Reporting, especially FR Y-9C, FR Y-14Q, and FR Y-14A
  • Provide support in developing and maintaining ETL processes that extract data from multiple SORs residing on various technology platforms and then transport the data to various delivery points such as data marts or data warehouses (see the sketch after this list).
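The consolidation work described above can be illustrated with a minimal pandas sketch; the file names, column names, and quality rules below are hypothetical stand-ins, not the actual SOR feeds or warehouse rules.

```python
import pandas as pd

# Hypothetical extracts from two systems of record (account_id, balance, as_of_date).
loans = pd.read_csv("sor_loans.csv")
deposits = pd.read_csv("sor_deposits.csv")

loans["product"] = "loan"
deposits["product"] = "deposit"
combined = pd.concat([loans, deposits], ignore_index=True)

# Basic data-quality checks before handing the file to the warehouse load:
# rows with missing keys or balances are routed to a reject file for review.
issues = combined[combined["account_id"].isna() | combined["balance"].isna()]
if not issues.empty:
    issues.to_csv("rejected_rows.csv", index=False)

combined.drop(issues.index).to_csv("edw_balance_feed.csv", index=False)
print(len(combined) - len(issues), "rows written to the consolidated EDW feed")
```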

Environment: SQL Server 2005/2008 Enterprise Edition, CCAR, FR Y-9C, FR Y-14Q, PPNR, FR Y-11, FR Y-9LP, FFIEC 031, FFIEC 041, FFIEC D09, SSIS 2005, MS Excel, MS Access, SharePoint 2013, Oracle 10g, UNIX, Windows XP, SQL, PL/SQL, PowerDesigner, Informatica

Confidential, Minneapolis, MN

Sr. Data Analyst

Responsibilities:

  • Document the complete process flow to describe program development, logic, testing, implementation, application integration, and coding.
  • Involved in defining the business/transformation rules applied for sales and service data.
  • Define the list codes and code conversions between the source systems and the data mart.
  • Consolidating data from multiple sources in support of campus-wide decision making and related information needs such as reporting, analysis, and planning (EDW).
  • Worked with internal architects, assisting in the development of current and target state data architectures
  • Worked with project team representatives to ensure that logical and physical ER/Studio data models were developed in line with corporate standards and guidelines
  • Experience with Teradata as the target for the data marts, worked with BTEQ, Fast Load and MultiLoad.
  • Involved in defining the source to target data mappings, business rules, business and data definitions
  • Worked with BTEQ to submit SQL statements, import and export data, and generate reports in Teradata.
  • Collaborate with data modelers and ETL developers in creating the Data Functional Design documents.
  • Responsible for defining the functional requirement documents for each source to target interface.
  • Document, clarify, and communicate requests for change requests with the requestor and coordinate with the development and testing team.
  • Coordinated meetings with vendors to define requirements and system interaction agreement documentation between client and vendor system.
  • Work with the business and the ETL developers in the analysis and resolution of data related problem tickets.
  • Update the Enterprise Metadata Library with any changes or updates
  • Document data quality and traceability documents for each source interface
  • Establish standards and procedures.
  • Generate weekly and monthly asset inventory reports.
  • Evaluated data profiling, cleansing, integration, and extraction tools (e.g., Informatica); see the profiling sketch after this list
  • Coordinate with the business users to provide an appropriate, effective, and efficient way to design new reporting needs based on user requirements and existing functionality
  • Remain knowledgeable in all areas of business operations in order to identify systems needs and requirements.
  • Responsible for defining the key identifiers for each mapping/interface
  • Implementation of Metadata Repository, Maintaining Data Quality, Data Cleanup procedures, Transformations, Data Standards, Data Governance program, Scripts, Stored Procedures, triggers and execution of test plans
  • Performed data quality checks in Talend Open Studio
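As a rough illustration of the kind of profiling evaluated above, a minimal pandas sketch of per-column quality metrics follows; the extract file name is a hypothetical placeholder.

```python
import pandas as pd

def profile(df):
    """Basic per-column data-quality metrics: type, completeness, cardinality."""
    return pd.DataFrame({
        "dtype": df.dtypes.astype(str),
        "null_pct": (df.isna().mean() * 100).round(1),
        "distinct_values": df.nunique(),
    })

# Hypothetical source extract used while profiling a mapping/interface.
source = pd.read_csv("source_extract.csv")
print(profile(source))
print("duplicate rows:", source.duplicated().sum())
```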

Environment: SQL Server, Oracle 9i, MS-Office, Teradata, Informatica, ER Studio, XML, Business Objects

Confidential

Responsibilities:

  • Plan, design, and implement application database code objects, such as stored procedures and views.
  • Build and maintain SQL scripts, indexes, and complex queries for data analysis and extraction.
  • Created ad-hoc reports for upper-level management using stored procedures and MS SQL Server 2005 Reporting Services (SSRS), following the business requirements (see the sketch after this list).
  • Created reports by extracting data from cubes and wrote MDX scripts.
  • Generated reports using SQL Server Reporting Services 2005/2008 from OLTP and OLAP data sources.
  • Provide database coding to support business applications using Sybase T-SQL.
  • Perform quality assurance and testing of SQL server environment.
  • Develop new processes to facilitate import and normalization, including data files for counterparties.
  • Work with business stakeholders, application developers, and production teams and across functional units to identify business needs and discuss solution options.
  • Ensure best practices are applied and integrity of data is maintained through security, documentation, and change management.
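A minimal sketch of one way such an ad-hoc report could be pulled, by executing a reporting stored procedure and exporting the result set to CSV; the connection string, procedure name, and parameter below are hypothetical placeholders, not the actual objects.

```python
import csv
import pyodbc

# Hypothetical connection string and stored procedure name.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=reporting01;"
    "DATABASE=Finance;Trusted_Connection=yes;")
cursor = conn.cursor()

# Run the reporting stored procedure for a given month and dump the rows to CSV.
cursor.execute("EXEC dbo.usp_MonthlyCounterpartySummary @ReportMonth = ?", "2008-06")
columns = [col[0] for col in cursor.description]

with open("counterparty_summary_200806.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(columns)
    writer.writerows(cursor.fetchall())

conn.close()
```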

Environment: SQL Server 2005 Enterprise Edition, T-SQL, Enterprise Manager, VBS.
