
Senior Data Engineer Resume

San Ramon, CA

SUMMARY:

  • Techno-functional Data Engineer seeking to advance my career with a company that encourages freedom of thought and welcomes new ideas, where I can apply my imagination and skills. I have learned to work in a highly agile and confidential domain, and I would like to bring my experience and knowledge to a productive environment.
  • 7 years of in-depth experience as a Business Intelligence professional with multinational companies around the world.
  • Extensive experience with Master Data Management (MDM) and data Extraction, Transformation, and Loading (ETL) using Informatica and SQL Server Integration Services (SSIS).
  • Excellent communication (verbal and written), presentation, and interpersonal skills.
  • Extensive knowledge and experience in Business Intelligence and data warehousing concepts.
  • In-depth experience in data analysis, data mapping, and data extraction, writing complex business queries against large databases for the ETL process.
  • Hands on experience with Business Data Modeling on MDM platforms like Informatica MDM.
  • Hands-on experience in creating MDX cubes and dimensions with tools such as SSAS.
  • Experience in designing, creating, deploying, and processing cubes using SSAS.
  • Performed intensive SQL data analysis on healthcare and banking data, providing detailed visualizations using SSRS or Tableau per client requirements.
  • Experience in ETL testing of SSIS/Informatica packages running at the backend, and in SQL-based validation of reports/dashboards for optimized performance.
  • Expert in Data Integrity, Performance Tuning and Query Optimization.
  • Experience in Requirements Gathering, preparing the Software Requirement Specifications (SRS) Document.
  • Excellent analytical and problem-solving skills, with the ability to understand and analyze complex issues and problems.
  • Good understanding of technical trends and architectures; a self-learner motivated to stay on top of the latest technology, software, and products.
  • Knowledge of the different phases of the Agile and Scrum methodologies.
  • Used TFS (Team Foundation Server) and JIRA for project management and source control.
  • Experience working with CRM technologies, enabling effective communication with customers.
  • Experience in the Software Quality Assurance testing life cycle, including functional testing and coordinating testing efforts for QAT, SIT, UAT, regression, smoke, integration, migration, performance, stress, and data-driven testing.
  • Experience in testing SSIS/Informatica packages running at the backend using Informatica DVO, and in SQL-based testing of reports/dashboards for optimized performance.
  • Experience in developing test plans, test strategy and test cases for MDM/PIM platforms.

TECHNICAL SKILLS:

Methodologies: Agile/Scrum, Water-Scrum methodology

Master Data Management Systems: Informatica MDM, Oracle MDM, Product Information Management (PIM)

Databases: Oracle 12c, IBM DB2

SDLC Methodologies: Waterfall, Agile, Spiral, Iterative

ETL Tools: SQL Server Integration Services (SSIS), Informatica, Talend

Analysis Tool: SQL Server Analysis Services (SSAS)

BI/Data Report Tools: SQL Server Reporting Services (SSRS), Tableau

Test Management Tools: HP Quick Test Professional, HP Quality Center, TFS, Informatica DVO, Query Surge, Bugzilla, Jira

Operating Systems: macOS, Windows, Linux

Documentation Tools: MS Office, MS Excel

PROFESSIONAL EXPERIENCE:

Confidential, San Ramon, CA

Senior Data Engineer

Responsibilities:

  • Executes tests and data extraction using ETL tools such as SSIS (SQL Server Integration Services) and Talend, and performs analysis using the department's various software packages within established deadlines.
  • Prepares and documents detailed specifications of the programs used to pull data, including the creation and use of documents that facilitate the request and ensure the integrity of the data provided to the internal customer.
  • Analyzes and makes recommendations on business, technology, and process-improvement opportunities.
  • Partners with various business lines to perform data analysis procedures as applicable to the support of ongoing projects.
  • Generates ad hoc reports using SSRS (SQL Server Reporting Services) and Tableau and regular datasets or report information for end-users using system tools and database or data warehouse queries.
  • Work with senior management, technical and client teams in order to determine data requirements, business data implementation approaches, and best practices for advanced data manipulation, storage and analysis strategies.
  • Perform Data Quality Analyses on MDM Data.
  • Helping business SMEs understand the MDM data model by answering the 'why' and 'how' of the data.
  • Write and code logical and physical database descriptions, specify identifiers of the database to the management system, or direct others in coding descriptions.
  • Modify existing databases and database management systems and/or direct programmers and analysts to make changes.
  • Test programs or databases, correct errors and make necessary modifications.
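
A minimal sketch of the extract-transform-load and ad-hoc reporting flow described above, using Python with SQLite as a stand-in for the actual SSIS/SQL Server stack (the table and column names are illustrative only):

```python
import sqlite3

def run_etl(source_rows):
    """Load cleaned customer rows into an in-memory warehouse table."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE customers (id INTEGER, name TEXT, state TEXT)")
    # Transform: normalize names/states and drop incomplete records.
    cleaned = [(r["id"], r["name"].strip().title(), r["state"].upper())
               for r in source_rows if r.get("name")]
    conn.executemany("INSERT INTO customers VALUES (?, ?, ?)", cleaned)
    conn.commit()
    return conn

conn = run_etl([
    {"id": 1, "name": "  alice smith ", "state": "ca"},
    {"id": 2, "name": None, "state": "tx"},  # dropped: missing name
    {"id": 3, "name": "bob jones", "state": "ny"},
])
# Ad-hoc report query against the loaded table.
rows = conn.execute("SELECT id, name, state FROM customers ORDER BY id").fetchall()
```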

Confidential, San Francisco, CA

Data Engineer

Responsibilities:

  • Assist in product design reviews to provide input on data functional requirements, product designs, schedules, or potential problems.
  • Analyze and design ETL (Extract, Transform and Load) automation scripts to test data flow, and report results using SQL Server Reporting Services (SSRS).
  • Develop ETL Scripts using Informatica that address areas such as database migration, regression testing, negative testing, error or bug retests, or usability.
  • Validate Informatica mappings, sessions and workflows for migrating data from multiple systems to target systems.
  • Represent data using quality-control metrics and provide summary statistics through visualizations such as charts in SSRS or Tableau.
  • Quality-check data conversion of ETL mappings in Informatica, which requires knowledge of both toolsets.
  • Document data validation defects, using TFS and Jira as bug tracking system.
  • Perform initial debugging procedures by reviewing configuration files, logs, or code pieces to determine breakdown source.
  • Update automated test scripts to ensure efficiency and correctness of data quality checks.
  • Identify program deviance from standards, and suggest modifications to ensure compliance.
  • Plan test schedules or strategies in accordance with project scope or delivery dates.
  • Hold defect review meetings with the business and development teams for resolving issues and bugs. Prioritize the Bugs reported and re-test.
  • Participating in execution and documentation of tests necessary to ensure that an application or technical environment meets performance requirements.
  • Worked on Validation of Match/Merge Rules to check the effectiveness of MDM process on data.
  • Performed analysis on Address Doctor results, which cleanse worldwide address data, and enhanced them with modifications during the QA phase.
  • Designing data-quality and optimization solutions using SQL, SSIS, and MDM strategies in Oracle.
  • Worked on data cleansing and standardization using the Cleanse functions in Informatica MDM and MD5 Checksum Function in Informatica to validate Data Integrity.
  • Adept in MDM processes: data cleansing and extraction, data governance, and data profiling.
  • Execute Automation Test Suite based on the requirement.
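
The MD5-checksum style of data-integrity validation used above can be sketched as follows. This is an illustrative Python analogue of Informatica's MD5() expression, comparing row digests between source and target instead of full rows (the key/column layout is hypothetical):

```python
import hashlib

def row_md5(row, sep="|"):
    """Hash a delimited concatenation of a row's columns (Informatica MD5() style)."""
    payload = sep.join("" if v is None else str(v) for v in row)
    return hashlib.md5(payload.encode("utf-8")).hexdigest()

def find_mismatches(source, target):
    """Compare source/target rows (dicts of primary key -> column tuple) by digest."""
    return sorted(k for k in source
                  if k in target and row_md5(source[k]) != row_md5(target[k]))

source = {101: ("Alice", "CA", 250.0), 102: ("Bob", "NY", 90.5)}
target = {101: ("Alice", "CA", 250.0), 102: ("Bob", "NJ", 90.5)}
mismatches = find_mismatches(source, target)  # key 102 differs
```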

Confidential, Charlotte, North Carolina

Data Engineer

Responsibilities:

  • Analyze ETL QA requirements and prepare a test Strategy Document to document QA scope of Requirements and get approval from Stakeholders.
  • Validate Informatica mappings, sessions and workflows for migrating data from multiple systems to target systems.
  • Creating MDM landing tables, staging tables, and base tables as per the data model and data sources.
  • Developing mappings using various cleanse functions and Address Doctor functions to move data into stage tables.
  • Working with PowerCenter Designer, Repository Manager, Workflow Manager, and Workflow Monitor.
  • Update automated test scripts using Informatica to ensure efficiency and correctness of data quality checks.
  • Extensively implemented Slowly Changing Dimensions (Types 1, 2, and 3).
  • Experience in Extraction, Transformation, and Loading (ETL) of data from various sources into data warehouses using Informatica PowerCenter/PowerMart (Repository Manager, Designer, Workflow Manager, and Workflow Monitor).
  • Worked extensively with Informatica client tools such as Designer, Workflow Manager, and Workflow Monitor during different phases of the project.
  • Created sessions/batches in the Informatica Server Manager to execute mappings.
  • Document data validation defects, using TFS as bug tracking system.
  • Perform initial debugging procedures by reviewing configuration files, logs, or code pieces to determine breakdown source.
  • Worked on all facets of MDM implementations, including data profiling, metadata acquisition, data migration, validation, reject processing, and pre-landing processing.
  • Expertise in the design and configuration of MDM landing tables, staging tables, base objects, hierarchies, foreign-key relationships, lookups, query groups, queries/custom queries, and packages.
  • Performing the requirements-gathering, analysis, design, development, testing, implementation, support, and maintenance phases of both MDM and Data Integration projects.
  • Hold defect review meetings with the business and development teams for resolving issues and bugs. Prioritize the Bugs reported and re-test.
  • Validate implementation of data masking and data cleansing in Informatica Data Quality (IDQ) Tool.
  • Define programming requirements: formulate a detailed plan and outline the steps to develop the program using structured analysis and design.
  • Convert project specifications into a sequence of detailed instructions using flowcharts and diagrams.
  • Create Test Cases, Test Plans, Conduct and Evaluate Unit Testing to verify correct implementation of Program/Module.
  • Performed match/merge and ran match rules to check the effectiveness of MDM process on data.
  • Create User Training, Reference Manuals.
  • Performed analysis on Address Doctor results, which cleanse worldwide address data, and enhanced them with modifications during the QA phase.
  • Designing data-quality and optimization solutions using SQL, SSIS, and MDM strategies in Oracle.
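
The Slowly Changing Dimension Type 2 pattern implemented above can be illustrated with a small Python sketch. The dimension layout (natural key, value, effective dates, current flag) is a simplified assumption, not the actual warehouse schema:

```python
from datetime import date

def scd2_apply(dim, updates, today):
    """Expire changed rows and append new current versions (SCD Type 2)."""
    for key, new_val in updates.items():
        current = next((r for r in dim if r["key"] == key and r["is_current"]), None)
        if current and current["value"] == new_val:
            continue  # unchanged: keep the current version
        if current:   # expire the previous version
            current["is_current"] = False
            current["end_date"] = today
        dim.append({"key": key, "value": new_val, "start_date": today,
                    "end_date": None, "is_current": True})
    return dim

dim = [{"key": "C1", "value": "Gold", "start_date": date(2020, 1, 1),
        "end_date": None, "is_current": True}]
scd2_apply(dim, {"C1": "Platinum"}, date(2021, 6, 1))
```

After the update, the Gold row is closed out with an end date and a new current Platinum row is appended, preserving full history for the key.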

Confidential

Lead Engineer

Responsibilities:

  • Developing ETL (Extract, Transform & Load) packages using SSIS (SQL Server Integration Services).
  • Gather Requirements by having Scrum calls with Business Team.
  • Analyzing the data loaded into the database using SSAS (SQL Server Analysis Services).
  • Designing reports and dashboards using SSRS (SQL Server Reporting Services) for pharmaceutical companies.
  • Automation of ETL packages using a scripting language.
  • Experience working with XML scripting to design ETL packages in the BIML tool.
  • Worked on Product Information management Databases for storing and maintaining Data for Pharmaceutical Companies.
  • Daily Scrum Calls for status update with Business Team.
  • Performing Unit Testing for Reports/Dashboards with Test Plan, Test Cases.
  • Creating Informatica workflows for data transfer from various source systems, and handling full and delta loads for data transfer from landing to base objects.
  • Worked on validation of match/merge rules to check the effectiveness of the MDM process on data.
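
A toy stand-in for the match/merge rule validation above: records whose names score above a similarity threshold are grouped into the same merge-candidate set. The scorer and threshold here are illustrative only, not the actual Informatica MDM match rules:

```python
from difflib import SequenceMatcher

def match_groups(records, threshold=0.85):
    """Greedily group records whose similarity to any group member meets the threshold."""
    groups = []
    for rec in records:
        for grp in groups:
            if any(SequenceMatcher(None, rec.lower(), m.lower()).ratio() >= threshold
                   for m in grp):
                grp.append(rec)  # merge candidate: join this group
                break
        else:
            groups.append([rec])  # no match: start a new group
    return groups

groups = match_groups(["John Smith", "Jon Smith", "Mary Jones"])
```

Validating a rule set then amounts to checking that known duplicates land in the same group and known distinct records do not.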

Confidential

Database Developer

Responsibilities:

  • Transferring data directly to front end reporting platform using SSIS automation.
  • Participated in requirements meetings and data mapping sessions to understand business needs.
  • Identified and documented detailed business rules and use cases based on requirements analysis.
  • Engineered and implemented automation of an SSIS package to load data directly into the GoodData workspace, reducing the overall time of the Extract-Transform-Load step in the business solution.
  • Created backup and checkpoints of daily incremental load by using scripting language in SSIS Package.
  • Developed new or maintained existing Informatica mappings and workflows based on Business Requirement.
  • Document specifications and changes suggested by end users.
  • Provided assistance in online campaigns and online marketing strategy execution using Google Analytics
  • Prepared and executed test cases, and participated in black box testing.
  • Developed and executed SQL queries to verify proper insertions, deletions, and updates in the database.
  • Working on Informatica ETL automation to transfer social data for trend analysis based on Salesforce data.
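
The checkpoint-driven full/delta load pattern used above can be sketched like this, with a JSON file standing in for the SSIS checkpoint; watermarking on a monotonically increasing row ID is an assumption for illustration:

```python
import json
import os
import tempfile

def load_checkpoint(path):
    """Return the last processed ID, or 0 if no checkpoint exists (full load)."""
    if os.path.exists(path):
        with open(path) as f:
            return json.load(f)["last_id"]
    return 0

def delta_load(rows, path):
    """Load only rows newer than the checkpoint, then advance the watermark."""
    last_id = load_checkpoint(path)
    new_rows = [r for r in rows if r["id"] > last_id]
    if new_rows:
        with open(path, "w") as f:
            json.dump({"last_id": max(r["id"] for r in new_rows)}, f)
    return new_rows

ckpt = os.path.join(tempfile.mkdtemp(), "ckpt.json")
rows = [{"id": 1}, {"id": 2}, {"id": 3}]
first = delta_load(rows, ckpt)   # first run: full load
second = delta_load(rows, ckpt)  # second run: no new rows
```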

Confidential

Systems Engineer

Responsibilities:

  • Analyze user needs and software requirements to determine feasibility of design within time and cost constraints.
  • Designed Reports objects such as attributes, measures and other BI solution components as per technical specification documents.
  • Analyzed and worked to provide the optimum BI solution to the user through daily stand-up calls with the customer.
  • Loaded data from the source database to the target database using the ETL tool SSIS.
  • Analyzed the data loaded into an intermediate database using the SSAS tool of Confidential.
  • Used SQL Server Management Studio 2008 to write SQL queries against the database for data-validation processes.
  • Created and modified tables, views and indexes; SQL stored procedures, functions and triggers according to business requirement.
  • Involved in executing SQL queries for back-end testing.
  • Developed reports/dashboards using SSRS, based on the cubes and data present in the target database.
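
The back-end SQL validation described above often reduces to reconciling row counts and totals between staging and target tables. A minimal Python/SQLite sketch (the table names and columns are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE stage  (id INTEGER, amount REAL);
    CREATE TABLE target (id INTEGER, amount REAL);
    INSERT INTO stage  VALUES (1, 10.0), (2, 20.0), (3, 30.0);
    INSERT INTO target VALUES (1, 10.0), (2, 20.0), (3, 30.0);
""")

def reconcile(conn):
    """Pass when row counts and amount totals match between stage and target."""
    s_cnt, s_sum = conn.execute("SELECT COUNT(*), SUM(amount) FROM stage").fetchone()
    t_cnt, t_sum = conn.execute("SELECT COUNT(*), SUM(amount) FROM target").fetchone()
    return s_cnt == t_cnt and s_sum == t_sum

ok = reconcile(conn)
```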
