Data Governance Engineer (Data Modeler) Resume

Charlotte, NC

SUMMARY:

  • Lead efforts to implement data warehousing and data migration/integration, including system analysis, technical design, development, and deployment of end-to-end solutions in enterprise model software.
  • Host scrum calls within the teams and follow the Agile methodology for projects using Scrum.
  • Architecture experience with ETL processes/tools such as Informatica, and with reporting tools such as OBIEE, creating reports used by executive staff.
  • Efficient in logical/physical data modeling, dimensional modeling, and star/snowflake schema design for OLTP, OLAP, and DSS systems. Refactor existing data models and databases for performance and scalability using SQL Data Modeler, ERwin, and ER/Studio.
  • Experience designing and developing technical documentation and functional and non-functional requirements, and publishing release notes shared across the organization to communicate what each release delivers.
  • Hands-on experience applying reverse engineering, forward engineering, and complete compare techniques in ERwin, ensuring data modeling concepts are enforced in the process.
  • Own ongoing projects and day-to-day activities spanning data modeling, data cleansing, data warehousing, and reporting across multiple stakeholders; liaise with SMEs/business users, and recruit, train, and mentor new team members.
  • Adopt efficient measures for security, scalability, and reliability when creating data models, message models, and canonical models.
  • Familiar with resource modeling concepts built on RESTful web service techniques and how they can be applied in APIs.
  • Experience on projects run with Agile techniques: working tasks in iterations under Scrum, hosting scrum calls, and ensuring a smooth overall process.
  • Work closely with architects and product managers to understand business requirements and help design, develop, and implement solutions and integrations that meet business needs.
  • Experience using version control (TortoiseSVN, Git), quality control, Salesforce management, and the Jira process; track and update issues in a timely manner to stay on the same page with stakeholders/customers.
  • Experience in data scrubbing, data profiling, and data masking using Trillium and Toad Data Point (see the profiling sketch after this list).
  • Experience working with XML data per ACORD standards; created XSDs for Personal Lines of insurance.
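
As a flavor of the data profiling mentioned above, here is a minimal SQL sketch; the table and column names (customer_stg, email) are hypothetical placeholders, not drawn from any actual engagement:

    -- Profile one column of a hypothetical staging table: row counts,
    -- null counts, and distinct counts indicate completeness and uniqueness.
    SELECT
        COUNT(*)                AS total_rows,
        COUNT(email)            AS non_null_emails,
        COUNT(*) - COUNT(email) AS null_emails,
        COUNT(DISTINCT email)   AS distinct_emails
    FROM customer_stg;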

TECHNICAL SKILLS:

Operating Systems: Windows NT/2000/XP, UNIX/LINUX

Languages: C, Java, SQL, ASP.NET with Visual Studio, PHP, ColdFusion

Databases/RDBMS: Oracle 12c/11g/10g/9i/8.0, Teradata, Microsoft SQL Server, Informix

Tools: Visual Paradigm, ERwin, SQL Data Modeler, OBIEE, Splunk, Informatica, STS, XMLSpy, Jira, Salesforce

Architecture: SOA, enterprise software architecture, RESTful concepts, AWS

OLAP/ Reporting: Dimensional Modeling, Star Schema, Snowflake Schema

Tableau: Tableau 9.7.2, Tableau Reader, Tableau Public

Packages: Microsoft Office Suite, Microsoft Project 2010, SAP, and Microsoft Visio.

Other Tools: SAS Enterprise Guide, SAP ECC and Panorama Web Service, CodeFlow/TFS.

PROFESSIONAL EXPERIENCE:

Data Governance Engineer (Data Modeler)

Confidential, Charlotte, NC

Responsibilities:

  • Worked as a Data Governance Engineer/Data Modeler with a global team of architects and solution designers on high- and low-level design requirements and data modeling; supported other leads, developers, and testers on the analysis and review portion of a financial risk management product and recommended solutions for different cross-functional teams in the organization (resolving issues reported through the Jira portal).
  • Adopted industry standards that support the financial players in the market, with the intent of providing enterprise solution software; in rare cases, built product-specific models to support an immediate need based on customer (bank/financial institution) requirements.
  • Worked to support and integrate the BIAN (Banking Industry Architecture Network, based in Germany) architecture into ACI, adopting its naming conventions for service scopes and operations so they come in handy for different customers.
  • Perform thorough gap analysis, data modeling, data discovery, and data mapping on complicated data sets with potentially complex data integration scenarios, with limited supervision.
  • For every quarterly release, apply data profiling and data quality management concepts to estimate the time required to complete a task at a granular level. This is always a critical and challenging exercise: arriving at a calculation (weighing quality, quantity, and time) to be followed for the release.
  • Engage with solution architects and developers of different products when they want to integrate their product functionality into the central utility of the enterprise software. This involves tasks such as preparing the high-level design document (HLD), ensuring it is enforced in the data modeling stage, and finally obtaining sign-off on the document from both parties.
  • Regularly perform local builds (on the local machine) and server builds (on the Jenkins server) and make sure they run clean before committing check-ins to the server; use STS and XMLSpy to debug errors generated in the build process.
  • Create and document processes for forward engineering, reverse engineering, and complete compare as required, ensuring the data integrity principles (entity integrity, domain integrity, and referential integrity) are followed as necessary (see the DDL sketch after this list).
  • Support monitoring of the Jira portal for issues raised by internal and external customers: assign them to the right people as required, resolve issues related to data model architecture, and maintain the SQL queries created to generate sets of issues by status.
  • Create canonical models that act as a schema or reference when building message models for customers.
  • Participate in weekly WebEx review sessions on the created canonical models with peers and architects on the team.
  • Create and review resource models as a future concept to be applied in the form of open banking APIs; brainstormed how RESTful web service concepts apply to resource modeling.
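
To make the integrity principles above concrete, here is a minimal DDL sketch; the table and column names are hypothetical illustrations, not taken from any actual model:

    -- Entity integrity: every row uniquely identified by a primary key.
    CREATE TABLE party (
        party_id    INTEGER       NOT NULL,
        party_name  VARCHAR(100)  NOT NULL,
        CONSTRAINT pk_party PRIMARY KEY (party_id)
    );

    CREATE TABLE account (
        account_id  INTEGER      NOT NULL,
        party_id    INTEGER      NOT NULL,
        -- Domain integrity: the status column only accepts defined values.
        status      VARCHAR(10)  NOT NULL
            CHECK (status IN ('OPEN', 'CLOSED', 'FROZEN')),
        CONSTRAINT pk_account PRIMARY KEY (account_id),
        -- Referential integrity: every account must reference an existing party.
        CONSTRAINT fk_account_party FOREIGN KEY (party_id)
            REFERENCES party (party_id)
    );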

Data Analyst/Data Modeler

Confidential, Atlanta, GA

Responsibilities:

  • Collaborate with data architects for data model management and version control.
  • Used the ERwin Model Mart for effective model management, sharing, dividing, and reusing model information and designs to improve productivity.
  • Conduct data model reviews with project team members.
  • Capture technical metadata through data modeling tools. Create data objects (DDL).
  • Enforce standards and best practices around data modeling efforts; apply data cleansing/data scrubbing techniques to ensure consistency among data sets.
  • Ensure data warehouse and data mart designs efficiently support BI and end users.
  • Involved in various projects related to Data Modeling, System/Data Analysis, Design and Development for both OLTP and Data warehousing environments.
  • Developed mapping spreadsheets for the ETL team with source-to-target data mappings, including physical naming standards, data types, volumetrics, domain definitions, and corporate metadata definitions (see the mapping sketch after this list).
  • Applied data governance best practices at UNUM, an insurance company, to achieve data governance business, functional, and IT goals.
  • Ensure compliance, increase customer satisfaction, support business integration, increase data quality, and support IT integration.
  • Create the data strategy; establish and implement DQ controls, standards, and metadata management.
  • Measure DQ on a continuous basis; establish data life-cycle and data architecture management.
  • Enable a single view of each master data class.
  • Involved in the design of the application, creating database tables and integrating the APIs built against the Informix database.
  • Apply data warehouse concepts and normalization principles in the design and development of star/snowflake schema data models.
  • Coordinate with peers on the start and end points of the APIs that ultimately connect at the end of a process's life cycle.
  • Work on the architecture diagram that gives a high-level design of how each API is called and how it connects to the others across the entire process.
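
A source-to-target mapping like the spreadsheets described above typically lands in the ETL layer as the equivalent of an INSERT ... SELECT. A minimal sketch, with hypothetical source/target names and transformation rules:

    -- Source: stg_policy (staging)  ->  Target: dim_policy (warehouse)
    INSERT INTO dim_policy (policy_no, holder_name, eff_dt)
    SELECT
        UPPER(TRIM(policy_number)),                      -- cleansing: trim and upper-case
        TRIM(holder_first) || ' ' || TRIM(holder_last),  -- derived column
        CAST(effective_date AS DATE)                     -- data type conversion
    FROM stg_policy
    WHERE policy_number IS NOT NULL;                     -- basic quality filter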

Data Analyst

Confidential

Responsibilities:

  • Design, develop, and maintain business metrics reports for multiple departments using ColdFusion as the scripting language, SQL for retrieving information from the database, and Teradata SQL Assistant for storage.
  • Ensure adoption of database modeling policy, strategies, standards, guidelines, and best practices for large, strategic application systems in the transportation domain (railroads).
  • Enforce data architecture, data quality, and data conversion/migration strategies.
  • Analyze business processes and deliver analytical reports for the finance department; generate reports and dashboards using Oracle Business Intelligence Enterprise Edition (OBIEE), creating and modifying KPIs to showcase business statistics for executive management.
  • Migrate data from Oracle to Teradata using Informatica PowerCenter, conduct performance tuning for ETL jobs, and run Informatica scripts.
  • Ensure the smooth transition of work between test, development, and production environments; debug issues as and when they arise.
  • Work on dimensional modeling for the shipment data (a railroad-specific term) generated every day, applying efficient principles to categorize and format the data (see the sketch after this list).
  • Refine existing data and data usage patterns to build new physical data models and refine existing data structures; convert business logic from the application layer to the database layer using data modeling concepts.
  • Coordinate with finance and marketing departments and SME groups to gather requirements, recommend solutions, provide efficient reports, and mentor users in using the reports efficiently per their needs.
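
As a sketch of how a shipment dimensional model like the one above might be queried, here is an illustrative aggregate over a hypothetical star schema (fact_shipment joined to dim_date); all names are assumptions, not the actual railroad model:

    -- Roll daily shipment volume up to the month via the date dimension.
    SELECT
        d.cal_year,
        d.cal_month,
        COUNT(*)         AS shipments,
        SUM(f.car_count) AS total_cars
    FROM fact_shipment f
    JOIN dim_date d
        ON f.ship_date_key = d.date_key   -- surrogate-key join (star schema)
    GROUP BY d.cal_year, d.cal_month
    ORDER BY d.cal_year, d.cal_month;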
