
ETL Engineer Resume


Omaha, NE

SUMMARY

  • IT professional with 7+ years of BI/data warehousing development, reporting, and application integration experience, including SSIS, SSRS, SQL programming, Tableau, and Mule 4.
  • Extensive experience in the analysis, design, development, implementation, and testing of software applications.
  • Strong knowledge of SQL Server BI (SSIS, SSAS, SSRS) Data Warehousing/Data Marts concepts and Dimensional Modeling.
  • Strong experience working in Software Development Life Cycle using various methodologies including Agile, Waterfall, and other client specific internal methodologies.
  • Experience using Oracle 11i/10g, Microsoft SQL Server 2005/20, and Informatica PowerCenter 9.5.1 and 10.
  • Experience working with different source databases and files, including Oracle, MS SQL Server 2008 R2/2012/2014, mainframe data, XML files, flat files, and MS Access.
  • Demonstrated expertise in ETL tools, ETL package design, and RDBMSs such as SQL Server, Oracle, and Teradata.
  • Extensive SQL experience accessing and manipulating database systems: complex queries, dynamic SQL, joins, stored procedures, user-defined functions, views, and indexes using T-SQL and PL/SQL.
  • Experience in debugging, performance tuning, error handling, and logging of SSIS packages.
  • Created and monitored daily scheduled jobs and DTS packages using VisualCron and SQL Server Agent.
  • Performance tuning of complex SQL queries, SQL profiling, and data modeling of tables using Erwin.
  • Experience developing SSRS reports, including tabular, matrix, chart, ad hoc, drill-down, parameterized, cascading, conditional, and sub-reports.
  • Developed Tableau Reports by blending data from different sources.
  • Worked on multiple Tableau sheets with actions, filters, bins, parameters, and calculated columns.
  • Good understanding of designing and building dimensions and cubes with star and snowflake schemas.
  • Ability to adapt quickly to changing skills and work, and to work in groups as well as independently with minimal supervision.
  • Involved in the complete Agile SDLC, including analysis, design, development, and testing.
  • Excellent communication, interpersonal, problem solving, and analytical skills.

TECHNICAL SKILLS

Modeling & Designing Tools: MS Visio 2007, Rational Rose 6.0, Erwin

SDLC Methodologies: Waterfall, Agile, Kanban

Programming Languages: C, SharePoint, Salesforce, T-SQL, Teradata SQL, SOQL, SOSL, Apex, Visualforce, DataWeave.

Operating Systems: Windows 2000/XP

Tools and IDEs: Microsoft Visual Studio 2005/2008/2012, Erwin Modeler, MDS, MS Office, SSMA, SQL Loader, SQL*Plus, SQL Server Agent, SQL Assistant, SQL Profiler, TFS, SVN, VisualCron Job Scheduler, Tableau, and MuleSoft Anypoint Studio

ETL and Reporting Tools: T-SQL (DDL, DML, TCL, DCL), SQL Server BIDS (SSIS, SSRS, SSAS), APS, Reporting Services (SSRS), Crystal Reports, Visualforce, Salesforce IDE, Tableau 8, Tableau Desktop, Informatica PowerCenter 8.6, Salesforce CRM, Agile.

Database Skills: PDW, SQL Server 2005/2008/2008 R2/2012, MS Access, Oracle 9i/10g, Teradata, Epic Clarity and Caboodle databases.

PROFESSIONAL EXPERIENCE

Confidential, Omaha, NE

ETL Engineer

Responsibilities:

  • Experience with the business intelligence integration lifecycle and application integration.
  • Knowledge of designing and planning ETL solutions, and of debugging, monitoring, and troubleshooting them.
  • Analyze source data, extract it from the Clarity and Caboodle databases, and transform and load it into the target data warehouse per requirement specifications using SSIS packages.
  • Work extensively on the ETL process using SSIS and modify it to meet organizational reporting needs.
  • Work on deduplication, validation, and standardization of emails using external components from Melissa for data quality, and export data to Salesforce objects.
  • Knowledge of creating Caboodle DMCs on the Caboodle console and importing non-Epic data into existing DMCs.
  • Design and Develop Tableau workbooks and Dashboard using Tableau Desktop.
  • Create Action filters, parameters and calculated fields for preparing dashboards and worksheets in Tableau.
  • Restrict data for particular users using row-level security and user filters.
  • Design and create API specifications using the RESTful API Modeling Language (RAML) in MuleSoft Design Center.
  • Participate in API design and requirement-gathering sessions with application developers; develop RAML documents using the Anypoint API platform and provide mocking services to application developers.
  • Create Mule applications that use the SFDC, HL7 EDI, Message Transformer, and HTTP connectors.
  • Implement exception handling, logging, and error handling.
  • Build and deploy Mule applications to MuleSoft CloudHub (PaaS).
  • Used Git version control system for code coordination.
  • Develop REST APIs using the MuleSoft Anypoint API Platform.
  • Developed MuleSoft applications with API-led connectivity (System, Process, and Experience layers).
  • Developed MuleSoft applications to transfer data from Epic to Salesforce using the HL7 and Salesforce Composite connectors, converting FHIR format to HL7 format for the exchange of electronic health record (EHR) data.
  • Worked on Epic and interface applications to read input HL7 data and transform it into JSON using the DataWeave expression language.
  • Upgraded the Mule runtime from 4.2.2 to 4.3.0 in each application, editing pom.xml files to reflect the new version.
  • Upgraded the HTTP, Salesforce, HL7, and JSON Logger connectors and other dependencies in each application.
  • Utilized Azure Repos, CI/CD tools, and GitHub Desktop during development and for deployment to Runtime Manager.
  • Worked in Agile; participated in daily scrum meetings to discuss project progress and any backlog items.
  • Supported end-user testing and provided post-production support.
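
The email deduplication and standardization step above is performed with Melissa's commercial components inside the SSIS flow; purely as an illustration of that kind of logic (trim/lowercase, a simple validity check, order-preserving dedupe), a minimal Python sketch might look like the following. The regex and function names are hypothetical, not Melissa's API:

```python
import re

# Deliberately basic pattern for illustration only; the Melissa component
# applies far richer validation and standardization than this regex.
EMAIL_RE = re.compile(r"^[\w.+-]+@[\w-]+\.[\w.-]+$")

def standardize(email):
    """Trim whitespace and lowercase: the usual first standardization step."""
    return email.strip().lower()

def dedupe_valid_emails(emails):
    """Standardize, validate, and deduplicate while preserving input order."""
    seen = set()
    result = []
    for raw in emails:
        email = standardize(raw)
        if EMAIL_RE.match(email) and email not in seen:
            seen.add(email)
            result.append(email)
    return result
```

In a real pipeline a step like this would run before the export to Salesforce objects, so duplicates never reach the target.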

Environment: Microsoft SQL Server 2014/2012, SSIS, Tableau, Caboodle, Clarity Data Model, CozyRoc, Melissa data quality components, Visual Studio 2015, Erwin Modeler, SSMS, SSMA, FTP/SFTP, T-SQL, SQL Server Agent, Oracle, Excel, flat files, TFS, SharePoint, Azure Repos, Git, GitHub Desktop, Mule 4 (runtime 4.2.2/4.3.0), Anypoint Studio (7.4, 7.5), Anypoint Platform (API Designer, Exchange, API Manager, Monitoring dashboards/metrics, Visualizer for application networking), CloudHub, Runtime Manager, DataWeave, Salesforce Cloud, Salesforce Composite Connector, HL7, FHIR, RAML, HTTP, REST, XML, JSON, Windows OS

Confidential, Omaha

Database/ BI Developer

Responsibilities:

  • Involved in user meetings to gather, discuss, and understand the functional requirements.
  • Worked extensively on ETL Process using SSIS.
  • Worked on data warehousing dimensional modeling concepts: star schema and snowflake schema.
  • Converted Data Transformation Services (DTS) applications to SQL Server Integration Services (SSIS) and CozyRoc tools for Google Analytics data.
  • Extensively worked on creating database objects: tables, views, stored procedures, SQL scripts, and indexes.
  • Designed and developed a framework for SSIS logging and error handling; optimized SSIS packages and tuned package performance.
  • Experience writing complex T-SQL queries for data analysis and extraction of user-specific datasets.
  • Developed dynamic SSIS packages across multiple environments for extracting, transforming, and loading (ETL) data from DB2, Oracle, Excel, CSV, XML, and flat files to SQL Server using SSIS.
  • Worked on extracting and loading data using bulk import and export tasks and the Bulk Copy Program (BCP).
  • Used ETL to implement Slowly Changing Dimensions to maintain historical and incremental change data in the data warehouse.
  • Supported ongoing development and administration of daily scheduled jobs and maintenance using the VisualCron scheduling tool.
  • Tested SSIS packages in DEV and QA and finally promoted jobs to production.
  • Involved in extensive data validation by writing several complex SQL queries, back-end testing, and working through data quality issues.
  • Documented system activities and provided technical documentation.
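
The Slowly Changing Dimension work above was built in SSIS; purely as an illustration of Type 2 logic (expire the changed row, insert a new versioned row), a minimal in-memory Python sketch could look like this. The column names (`cust_id`, `city`, `valid_from`, `valid_to`, `is_current`) are hypothetical, not the project's actual schema:

```python
from datetime import date

def apply_scd_type2(dimension, incoming, today=None):
    """Minimal Slowly Changing Dimension Type 2 sketch over lists of dicts.

    `dimension` holds the existing rows; `incoming` holds the latest
    source records keyed by the business key `cust_id` with one tracked
    attribute, `city`.
    """
    today = today or date.today()
    current = {row["cust_id"]: row for row in dimension if row["is_current"]}
    for rec in incoming:
        old = current.get(rec["cust_id"])
        if old is None or old["city"] != rec["city"]:
            if old is not None:
                # Expire the superseded version instead of overwriting it,
                # preserving history (the essence of Type 2).
                old["is_current"] = False
                old["valid_to"] = today
            dimension.append({
                "cust_id": rec["cust_id"],
                "city": rec["city"],
                "valid_from": today,
                "valid_to": None,
                "is_current": True,
            })
    return dimension
```

In SSIS the equivalent behavior comes from the SCD transformation or a hand-written T-SQL MERGE; the sketch only shows the branching logic.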

Environment: Microsoft SQL Server 2014/2012, SSIS, Tableau, Caboodle, Clarity Data Model, Cozyroc, Visual Studio 2012, Erwin Modeler, SSMS, SSMA, FTP/SFTP, T-SQL, SQL Server Agent, Oracle, Excel, Flat files, TFS, SharePoint.

Confidential, NJ

MSBI Developer (SSIS, SSRS)

Responsibilities:

  • Interacted with business representatives for requirement analysis.
  • Used ETL to implement Slowly Changing Dimension to maintain historical data in Data Warehouse.
  • Developed SSIS packages for extracting, transforming, and loading (ETL) data from DB2, SAS data sets, Oracle, Excel, and flat files to PDW SQL Server using SSIS.
  • Created complex stored procedures to create data tables, partition complex tables, and build views that pull metadata information from MDS.
  • Developed dynamic SQL scripts, indexes, and complex queries for data analysis and extraction.
  • Used SSIS and T-SQL stored procedures to transfer data from OLTP databases to a staging area and finally into the target, implementing incremental loads with CDC (Change Data Capture) timestamps.
  • Developed dynamic SQL procedures to auto-create tables and views.
  • Worked on partitioning tables and swapping partitions.
  • Experience creating drill-down, drill-through, parameterized, linked, and matrix reports based on business requirements.
  • Developed SSRS reports from multiple dataset sources and the data warehouse using SQL Server Reporting Services (SSRS) tools such as SSDT, Report Builder, and Report Manager to manage the reports.
  • Tuned performance of SQL queries and stored procedures using SQL Profiler and the Index Tuning Wizard; partitioned tables and optimized SSRS report performance by optimizing datasets and reducing unneeded report parameters.
  • Created and scheduled jobs in Dev and QA for testing before moving to the PROD environment, using the VisualCron job scheduler and SQL Server Agent.
  • Collaborated with the team to monitor full/incremental/daily loads and support all scheduled ETL jobs for batch processing, project version control, change management, and bug tracking.
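
The incremental loads above were implemented in SSIS and T-SQL with CDC timestamps; as a sketch only, the watermark pattern behind a timestamp-based incremental extract can be shown in a few lines of Python. The column name `last_modified` is hypothetical:

```python
def incremental_extract(source_rows, last_watermark):
    """Timestamp-based incremental extract sketch.

    Returns the rows changed since `last_watermark` plus the new
    watermark to persist for the next run. Any orderable timestamp
    type works (datetime, epoch integer, ...).
    """
    changed = [r for r in source_rows if r["last_modified"] > last_watermark]
    # If nothing changed, keep the old watermark rather than resetting it.
    new_watermark = max((r["last_modified"] for r in changed),
                        default=last_watermark)
    return changed, new_watermark
```

The real packages persist the watermark in a control table so each scheduled run picks up exactly where the previous one stopped.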

Environment: Microsoft PDW Server, Microsoft SQL Server 2014/2012, SSIS, Visual Studio 2012, Erwin Modeler, FTP/SFTP, MDS, MDX, VisualCron Scheduler, T-SQL, SQL Server Agent, Oracle, Excel, flat files, Tortoise SVN (version control system), Report Builder.

Confidential

SQL Developer

Responsibilities:

  • Worked as a SQL Developer responsible for creating database objects such as tables, views, stored procedures, functions, and triggers.
  • Interacted with the business, participated in requirement-gathering meetings, and converted requirements into SQL stored procedures for specific projects.
  • Worked on bulk import and export of data using database tasks.
  • Worked with business users to provide database access and permissions and supplied requested data and queries.
  • Designed SSIS packages to export data to flat files for specific reporting needs.
  • Designed and developed SSIS packages for loading data from OLTP to ODS and finally to the data warehouse.
  • Monitored and scheduled jobs from SQL Server Agent and provided support for jobs.
  • Provided documentation of code changes, known issues, and bug fixes for the back-end databases and jobs.

Environment: SSRS, SSAS, Visual Studio 2008, Power Pivot, Excel, T-SQL, Salesforce, SOSL, SOQL, Apex, Visualforce, Data Loader.
