
Sr. Data Engineer Resume


TX

SUMMARY

  • Over 7 years of IT experience in the field of Data/Business Analysis, ETL Development, Master Data Management, Data Modeling, and Project Management.
  • Strong experience in Business and Data Analysis, Data Profiling, Data Migration, Data Conversion, Data Quality, Data Governance, Data Integration and Metadata Management Services and Configuration Management.
  • Over 2 years of experience in Data Modeling with expertise in creating Star and Snowflake Schemas, FACT and Dimension Tables, and Physical and Logical Data Modeling using Erwin and Embarcadero.
  • Knowledge in Business Intelligence tools like Business Objects, MSBI, Cognos and OBIEE.
  • Experience with Teradata as the target for the data marts, worked with BTEQ, FastLoad and MultiLoad.
  • Strong experience in interacting with stakeholders/customers, gathering requirements through interviews, workshops, and existing system documentation or procedures, defining business processes, identifying and analyzing risks using appropriate templates and analysis tools.
  • Experience in various phases of Software Development life cycle (Analysis, Requirements gathering, Designing) with expertise in documenting various requirement specifications, functional specifications, Test Plans, Source to Target mappings, SQL Joins.
  • Experience in conducting Joint Application Development (JAD) sessions for requirements gathering, analysis, design and Rapid Application Development (RAD) sessions to converge early toward a design acceptable to the customer and feasible for the developers and to limit a project’s exposure to the forces of change.
  • Gathered and documented MDM application, conversion and integration requirements.
  • Worked in Agile Methodologies for the implementation of the project.
  • Experience in coding SQL/PL/SQL using Procedures, Triggers and Packages.
  • Good understanding of Relational Database Design, Data Warehouse/OLAP concepts and methodologies.
  • Worked on different platforms such as Windows 95/98/NT and UNIX, Sun Solaris, AIX, HP
  • Experience in Working with different industries like Financial, Media, Retail and Banking.
  • Experienced in CCAR Regulatory Reporting and CCAR Stress Testing for Banking Balance Sheet Asset Management
  • Experience with MDM Hub configurations - Data modeling & Data Mappings, Data validation, cleansing, Match and Merge rules.
  • Created Functional Decomposition diagrams to represent the data management activities.
  • Insight into the design and development of Tableau visualizations, including preparing dashboards using calculations, parameters, calculated fields, groups, sets and hierarchies.
  • Good knowledge of developing, designing and supporting interactive Tableau dashboard reports.
  • Experience in creating functional/technical specifications, data design documents based on the requirements.
  • Responsible for quality control activities including test case management, test execution suite design, defect cycle management, and test case design and execution
  • Creation of a Text Analysis Library in R and Python having multiple generic as well as customized applications
  • Have a good understanding of the object data model of Sales Cloud-related objects such as Opportunities, Quotes, QuoteLineItems, Products and Pricebooks.
  • Experienced in analyzing data using HiveQL and Pig Latin and custom MapReduce programs in Java.
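The dimensional-modeling experience above can be illustrated with a minimal star-schema sketch. All table and column names here are hypothetical, invented for illustration; sqlite3 stands in for an actual warehouse target.

```python
import sqlite3

# In-memory database standing in for a warehouse target; one fact table
# joined to two dimension tables (all names are illustrative only).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, cal_date TEXT, quarter TEXT);
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, product_name TEXT);
CREATE TABLE fact_sales  (date_key INTEGER REFERENCES dim_date(date_key),
                          product_key INTEGER REFERENCES dim_product(product_key),
                          amount REAL);
""")
cur.execute("INSERT INTO dim_date VALUES (20240101, '2024-01-01', 'Q1')")
cur.execute("INSERT INTO dim_product VALUES (1, 'Widget')")
cur.execute("INSERT INTO fact_sales VALUES (20240101, 1, 250.0)")

# Typical star-schema query: join the fact table to its dimensions, then aggregate.
rows = cur.execute("""
SELECT d.quarter, p.product_name, SUM(f.amount)
FROM fact_sales f
JOIN dim_date d    ON d.date_key = f.date_key
JOIN dim_product p ON p.product_key = f.product_key
GROUP BY d.quarter, p.product_name
""").fetchall()
print(rows)  # [('Q1', 'Widget', 250.0)]
```

The key design point of a star schema is that every analytical query follows the same shape: fact table in the middle, one join per dimension, group by the dimension attributes.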

TECHNICAL SKILLS

Languages: PostgreSQL, T-SQL, PL/SQL, R, Python

Data Visualization: Tableau

Data Analysis: SAS, R, Google Analytics and Adobe Analytics

Modeling Techniques: Predictive Modeling/ANOVA/Linear Regression, Logistic Regression/Cluster Analysis

Web Technologies: HTML, DHTML, JavaScript, XML

Databases: MS SQL Server (2012/2008/2005/2000/7.0), MS Access, Oracle 10g

Reporting: SQL Server Reporting Services (SSRS), Tableau, CCAR

Process/Model Tools: MS Office, MS Project, Visio, SharePoint 2013, Rational Rose, RequisitePro, ClearCase, ClearQuest, MS Excel, MS PowerPoint, MS Word

PROFESSIONAL EXPERIENCE

Confidential

Sr. Data Engineer

Responsibilities:

  • Successfully finished the MB Integration project to merge into the Fifth Third EDW, feeding both sources into solutions and ensuring that existing EDO data assets (EDW, CCRA, FMP, Data Lake and PDA) maintain stability and data quality after the integration of new data
  • Played a vital role in developing EDO strategies for a project that is part of a multi-phased approach focused on re-engineering Fifth Third's Consumer and Small Business credit card infrastructure, optimizing revenue and delivery costs by 19%, thereby improving customer servicing and positioning the bank for growth
  • Built an assignment model to leverage the new FNA ownership model for marketing and customer engagement activities, enabling FNA (customer)-to-banker and FNA (customer)-to-branch views
  • Finished the consumer revenue redesign, allowing reconciliation of customer-account level revenue to general ledger from all the source systems
  • Analyzed large data sets to provide strategic direction to the Bank. Worked with various segments of banking (Loans, Mortgages, Debit Cards, Trust, Credit Cards, Deposits, Insurance, Plans, Marketing, etc.) to generate the GL files and load them into the EDW, further creating PL-KPI groupings
  • Created Python scripts to generate reports, insightful dashboards and market analysis, and provided predictive modeling support to the GTM team focused on improving the performance of personalization campaigns
  • Collaborated with business process owners to define requirements and to design and present solutions that meet the requirements accurately. Provided strategic recommendations based on analysis of the data, the business situation, and moderate knowledge of systems.
  • Conducted cost benefit analysis and identified profitable advertising mediums to target customers. Used clusters to build predictive models relying on Random Forest, logistic regression and classification trees to forecast lead conversion and identify factors that have a direct impact on the Bank's marketing efforts.
  • Extract, transform, load, process, evaluate and maintain integrity of client-provided data.
  • Created SQL jobs to auto generate Daily Schedule Reports, Monthly Run and Quarterly Run of Reports according to the requirements defined. Involved in writing complex queries, building packages and reports to support different applications.
  • Creation of Requirements and Functional Design Documents detailing how the data related to assigned projects should be transformed for the Data Warehouse, stored within the Data Warehouse, and transformed for the downstream applications.
  • Create and/or maintain high-level reporting systems using SQL or other ETL tools. Troubleshoot and support various MS Access/SQL databases developed and utilized by the team to ensure optimal performance and data reliability at all times.
  • Documented various Data Quality mapping documents and audit and security compliance adherence.
  • Developed, tested, and executed programs to harvest the required metadata from an environment remote from Collibra Data Governance Center.
  • Played a key role in conducting user acceptance testing with Line of Business, Stakeholders and downstream users by acting as a middle layer between the Technical team and the Business users.
  • Perform data analysis including, but not limited to, the use of SQL, DataStage, Mainframes, BlueZone Secure FTP, BO Universe, etc.
  • Perform some aspects of project management including but not limited to approval gathering, readiness assessments, and other artifacts and controls.
  • Participate and present ideas for system improvements in Business Working team reviews with stakeholders/LOB of the project and discuss improvements.
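The scheduled-report work described above could be sketched, in highly simplified form, as a Python job that runs a query and writes the result as a dated CSV. The query, table, and column names are illustrative assumptions, not taken from any system named in this resume; sqlite3 stands in for the warehouse connection.

```python
import csv
import io
import sqlite3
from datetime import date

def run_daily_report(conn, out_stream):
    """Run a (hypothetical) daily balance query and write the rows as CSV."""
    cur = conn.execute(
        "SELECT account_id, SUM(amount) AS balance "
        "FROM transactions GROUP BY account_id ORDER BY account_id"
    )
    writer = csv.writer(out_stream)
    writer.writerow(["report_date", "account_id", "balance"])
    for account_id, balance in cur:
        writer.writerow([date.today().isoformat(), account_id, balance])

# Demo with an in-memory database standing in for the EDW.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE transactions (account_id TEXT, amount REAL)")
conn.executemany("INSERT INTO transactions VALUES (?, ?)",
                 [("A1", 100.0), ("A1", -25.0), ("A2", 40.0)])
buf = io.StringIO()
run_daily_report(conn, buf)
print(buf.getvalue())
```

In production such a routine would be wired to a scheduler (SQL Server Agent, cron, or similar) for the daily, monthly, and quarterly runs the bullet points describe.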

Environment: Cardett AQT, IBM InfoSphere DataStage, Python, IBM InfoSphere MDM, IBM DB2DBCOPY1, BlueZone Mainframe, Business Objects XI 3.1, Blue Zone Secure FTP, PUTTY, Oracle Data Mining, SAP Business Intelligence, SAS, ERWIN r9.7, Adobe Analytics, Microsoft Suite, ER/Studio, Collibra, InfoSphere Information Governance Catalog, SharePoint, Application Lifecycle Management (ALM) Quality Center, BMC Remedy IT Service Management, MDM Analytics.

Confidential, TX

Sr. Data Analyst / Data Modeler, Plano

Responsibilities:

  • Created the eRestaurant model by analyzing the existing BOH structure and built solutions for various systems integrating into CloudBridge infrastructure as part of eRestaurant Implementation
  • Built a churn model to predict the likelihood of sales during non-holiday/non-game days. These scores were used in planning inventory and employee pay rates, reducing the employee churn rate by 30%
  • Created a weekly review dashboard that allowed viewing important metrics to assess store sales and their weekly performance
  • Developed strategy for personalized reward campaign, generated net sales lift of $98K and increased customer base by 21%
  • Work closely with MDM Informatica Architects to fully understand and “read” ETL processes and process flows for business requirements; understand the business rules that are coded within the ETL load processes.
  • Coordinated meetings with vendors to define requirements and system interaction agreement documentation between client and vendor system.
  • Worked with project team representatives to ensure that logical and physical ER/Studio data models were developed in line with corporate standards and guidelines
  • Configured MDM tables in the following sequence: Base Objects, then Relationship Base Objects; defined and developed the ETL process to load the data into the landing tables during the land process
  • Created SQL Server Analysis Services projects with multi-dimensional databases, OLAP cubes and dimensions for the analysis of sales in specific areas using SSAS.
  • Experience with Vertica as the target for the data marts (tables).
  • Worked with PostgreSQL for creating sample data and created process flow diagrams using Visio.
  • Document various Data Quality mapping documents and audit and security compliance adherence.
  • Work with the business and the ETL developers in the analysis and resolution of data related problem tickets.
  • Used MicroStrategy to pull reports during mapping analysis and assisted in building reports for inventory and sales/polling.
  • Deployed analysis projects using SSAS based on data volume and developed KPIs that made the data easy for business users to understand.
  • Implemented complex discounting structures using the Sales Cloud module in SalesForce.com.
  • Configured product bundle pricing with tiered support discounts for various customers such as distributors and resellers.
  • Experience in coding SQL/PL/SQL using Procedures, Triggers and Packages.
  • Extensive experience in the HP ALM administration module and performed QA duties.
  • Involved in data quality - testing the code generated by the developers and executed/evaluated manual or automated test cases and reported test results.
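A churn model of the kind described above would, once trained, score each customer with a logistic function over weighted features. This toy sketch shows only the scoring step; the feature names and coefficients are invented for illustration and are not from any real model mentioned here.

```python
import math

def churn_probability(features, weights, bias):
    """Logistic-regression scoring: sigmoid of the weighted feature sum plus bias."""
    z = bias + sum(w * x for w, x in zip(weights, features))
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical features: [days_since_last_visit, complaint_count, tenure_years]
weights = [0.05, 0.8, -0.3]   # invented coefficients, for illustration only
bias = -2.0
p = churn_probability([30, 1, 2], weights, bias)
print(round(p, 3))
```

In practice the weights would come from fitting on historical churn labels; the scores feed downstream decisions such as inventory planning and staffing, as the bullet above describes.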

Environment: Erwin r9, PostgreSQL, Server 2008, UNIX AIX version 6, HP Vertica Analytics Platform 6.1.x, MicroStrategy desktop V 9.3.1, OBIEE, MS Excel, Google Analytics, MS Visio 2013, TOAD, SQL, PL/SQL, Rally, QuickBase, Talend, Windows, full copy sandbox, Salesforce Schema Builder, Data Loader, Salesforce Workbench.

Confidential, LA

Sr. Data Analyst

Responsibilities:

  • Involved in defining the business/transformation rules applied for sales and service data.
  • Define the list codes and code conversions between the source systems and the data mart.
  • Worked with internal architects, assisting in the development of current and target state data architectures
  • Worked with project team representatives to ensure that logical and physical ER/Studio data models were developed in line with corporate standards and guidelines
  • Involved in defining the source to target data mappings, business rules, business and data definitions
  • Responsible for defining the key identifiers for each mapping/interface
  • Responsible for defining the functional requirement documents for each source to target interface.
  • Coordinated meetings with vendors to define requirements and system interaction agreement documentation between client and vendor system.
  • Experience with Teradata as the target for the data marts, worked with BTEQ, FastLoad and MultiLoad.
  • Evaluated data profiling, cleansing, integration and extraction tools (e.g. Informatica)
  • Consolidating data from multiple sources in support of campus-wide decision making and related information needs such as reporting, analysis, and planning (EDW).
  • Produced PL/SQL statements and stored procedures in DB2 for extracting as well as writing data
  • Developed PL/SQL stored procedures for the end-user report requirements
  • Worked with PostgreSQL for creating sample data and created process flow diagrams using Visio.
  • Implementation of Metadata Repository, Maintaining Data Quality, Data Cleanup procedures, Transformations, Data Standards, Data Governance program, Scripts, Stored Procedures, triggers and execution of test plans
  • Creation of a Text Analysis Library in R and Python having multiple generic as well as customized applications
  • The data was cleaned and run through R to analyze and predict the change in batch inflow for each customer and its effect on the environment.
  • To ensure accurate information, performed data quality analysis and validation and conducted data mining to identify trends and solutions to improve the quality of clinical data. Collect, maintain, manipulate, query, research and scrub data to ensure accuracy and integrity.
  • Configured product bundle pricing with tiered support discounts for various customers such as distributors, resellers and end users using CPQ in Sales Cloud.
  • Performed load and stress testing to benchmark the performance of the service

    Technologies/tools used: R, Rserve, Java, Python, Docker containers
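A text analysis library of the kind described above would typically include a generic term-frequency utility. The sketch below is a plausible illustration only; the function name, stopword list, and tokenization rule are assumptions, not the original library's API.

```python
import re
from collections import Counter

def term_frequencies(text, stopwords=frozenset({"the", "a", "and", "of"})):
    """Lowercase the text, tokenize on word characters, drop stopwords, count terms."""
    tokens = re.findall(r"[a-z0-9']+", text.lower())
    return Counter(t for t in tokens if t not in stopwords)

freqs = term_frequencies("The batch inflow and the batch outflow of data")
print(freqs.most_common(2))  # [('batch', 2), ('inflow', 1)]
```

A "customized application" in such a library would layer domain-specific vocabulary (e.g., clinical or customer terms) on top of a generic counter like this one.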

Environment: SQL/Server, PL/SQL, PostgreSQL, Oracle 9i, MS-Office, Teradata, Informatica, ER Studio, XML, Business Objects, R, Python, UNIX, Adobe Analytics, full copy sandbox, Salesforce Schema Builder, Data Loader, Salesforce Workbench.

Confidential, LA

Sr. Data Analyst

Responsibilities:

  • Provided Treasury and Cash Management support to the VP of Finance, Assistant Treasurer and Accounts Payable Department
  • Work with Project Management in the creation of project estimates.
  • Provide regulatory compliance risk expertise and consulting for projects to identify and mitigate regulatory compliance risk
  • Analyzed the data, identifying data sources and data mappings for HCFG.
  • Worked extensively in documenting the Source to Target Mapping documents with data transformation logic.
  • Interact with the SME’s to analyze the data extracts from Legacy Systems (Mainframes and COBOL Files) and determine the element source, format and its integrity within the system
  • Transformation of requirements into data structures which can be used to efficiently store, manipulate and retrieve information
  • Collaborate with data modelers and ETL developers in creating the Data Functional Design documents.
  • Ensure that models conform to established best practices (including normalization rules) and accommodate change in a cost-effective and timely manner.
  • Work with the business and the ETL developers in the analysis and resolution of data related problem tickets.
  • Support development teams creating applications against supported databases.
  • Perform small enhancements (SOR element additions, data cleansing/data quality).
  • Create various Data Mapping Repository documents as part of Metadata services (EMR).
  • Provide inputs to development team in performing extraction, transformation and load for data marts and data warehouses.
  • Developed a large number of standard and customized reports using SSRS for data feeds and for reports on the server.
  • Involved in creating and testing the SSIS packages developed for security.
  • Modified PL/SQL scripts to validate and load data into interface tables. The backend was in Oracle and database operations were handled using stored procedures
  • Developed PL/SQL stored procedures for the end-user report requirements
  • Actively involved in developing complex SSRS reports involving sub-reports, matrix/tabular reports, charts and graphs
  • Produced PL/SQL statements and stored procedures in DB2 for extracting as well as writing data
  • Using SharePoint, created a new site and inserted an Excel dashboard for SharePoint users. Added and configured SharePoint lists and sites, and created custom forms
  • Configured SharePoint site administration settings
  • Document various Data Quality mapping documents and audit and security compliance adherence.
  • Strong background in CCAR Regulatory Reporting, especially FR Y-9C, FR Y-14Q and FR Y-14A
  • Provide support in developing and maintaining ETL processes that extract data from multiple SOR’s residing on various technology platforms then transport the data to various delivery points such as data marts or data warehouses.

Environment: SQL Server 2005/2008 Enterprise Edition, CCAR, FR Y-9C, FR Y-14Q, PPNR, FR Y-11, FR Y-9LP, FFIEC 031, FFIEC 041, FFIEC D09, SSIS 2005, MS Excel, MS Access, SharePoint 2013, Oracle 10g, UNIX, Windows XP, SQL, PL/SQL, Power Designer, Informatica

Confidential, Minneapolis, MN

Sr. Data Analyst

Responsibilities:

  • Document the complete process flow to describe program development, logic, testing, implementation, application integration, and coding.
  • Involved in defining the business/transformation rules applied for sales and service data.
  • Define the list codes and code conversions between the source systems and the data mart.
  • Consolidating data from multiple sources in support of campus-wide decision making and related information needs such as reporting, analysis, and planning (EDW).
  • Worked with project team representatives to ensure that logical and physical ER/Studio data models were developed in line with corporate standards and guidelines
  • Experience with Teradata as the target for the data marts, worked with BTEQ, FastLoad and MultiLoad.
  • Involved in defining the source to target data mappings, business rules, business and data definitions
  • Worked with BTEQ to submit SQL statements, import and export data, and generate reports in Teradata.
  • Collaborate with data modelers and ETL developers in creating the Data Functional Design documents.
  • Responsible for defining the functional requirement documents for each source to target interface.
  • Document, clarify, and communicate requests for change requests with the requestor and coordinate with the development and testing team.
  • Coordinated meetings with vendors to define requirements and system interaction agreement documentation between client and vendor system.
  • Work with the business and the ETL developers in the analysis and resolution of data related problem tickets.
  • Maintain the Enterprise Metadata Library with any changes or updates
  • Document data quality and traceability documents for each source interface
  • Establish standard operating procedures.
  • Generate weekly and monthly asset inventory reports.
  • Evaluated data profiling, cleansing, integration and extraction tools (e.g. Informatica)
  • Coordinate with business users to design new reporting needs in an appropriate, effective and efficient way alongside the existing functionality
  • Remain knowledgeable in all areas of business operations in order to identify systems needs and requirements.
  • Responsible for defining the key identifiers for each mapping/interface
  • Implementation of Metadata Repository, Maintaining Data Quality, Data Cleanup procedures, Transformations, Data Standards, Data Governance program, Scripts, Stored Procedures, triggers and execution of test plans
  • Performed data quality checks in Talend Open Studio

Environment: SQL/Server, Oracle 9i, MS-Office, Teradata, Informatica, ER Studio, XML, Business Objects

Confidential

Responsibilities:

  • Plan, design, and implement application database code objects, such as stored procedures and views.
  • Build and maintain SQL scripts, indexes, and complex queries for data analysis and extraction.
  • Created ad-hoc reports for the upper level management using Stored Procedures and MS SQL Server 2005 Reporting Services (SSRS) following the business requirements.
  • Created reports by extracting data from the cube and wrote MDX scripts.
  • Generated reports using SQL Server Reporting Services 2005/2008 from OLTP and OLAP data sources.
  • Provide database coding to support business applications using Sybase T-SQL.
  • Perform quality assurance and testing of SQL server environment.
  • Develop new processes to facilitate import and normalization, including data files for counterparties.
  • Work with business stakeholders, application developers, and production teams and across functional units to identify business needs and discuss solution options.
  • Ensure best practices are applied and integrity of data is maintained through security, documentation, and change management.

Environment: SQL Server 2005 Enterprise Edition, T-SQL, Enterprise manager, VBS.
