
ETL/Qlik Sense Report Developer Resume


Atlanta, GA

PROFESSIONAL SUMMARY:

  • Strong IT experience in ETL and report design in the Business Intelligence space
  • Experience in ETL using Pentaho, SAP BO Data Services and SSIS
  • Experience in development of reports/dashboards using Qlik Sense, Birst, and Pentaho Report Designer
  • Experience in all phases of the Software Development Life Cycle (SDLC) of data warehousing, including requirement gathering, analysis, design, source-to-target data mappings, ETL, and report development
  • Experience in RDBMS like Oracle, PostgreSQL and SQL Server
  • Expertise in understanding and analyzing logical/physical data models for data warehousing using Ralph Kimball dimensional modeling techniques such as Star and Snowflake schemas, Facts (Transactional, Summary/Aggregate, Late Arriving), and Dimensions (SCD, Hierarchies, Junk, Mini, Rapidly Changing, Role-Playing)
  • Experience in writing complex SQL to validate ETL/report data (a validation sketch follows this list)
  • Experience in developing ETLs using Pentaho Community Edition 5.0 and Pentaho Enterprise Edition 5.4 using transformation steps like Table Input, Select Values, Add Constants, Group By, Sort Rows, String Operations, Merge Rows (diff), Update, and Table Output for initial and incremental loads of SCD Type 1 & 2
  • Experience in design, development and migration of ETL jobs/workflows/data flows using BusinessObjects Data Services (BODS) and transformations like Key Generation, SQL Query Transform, Table Comparison, Merge, Case, Lookup, Map Operation, and History Preserving (Slowly Changing Dimensions Type 2)
  • Experience in source systems analysis and data extraction from various sources like Flat files, Oracle, DB2, MS SQL Server using SQL Server Integration Services (SSIS)
  • Experience in debugging & unit testing ETL code for issues/performance and producing validation documentation
  • Ability to communicate with end users to transform business requirements/KPI metrics into Pentaho/ Birst reports and/or dashboards
  • Experience in Birst BI Data Modeling (Admin), Live Access, Reporting (Designer/Visualizer/Dashboards) and space management
  • Experience in Birst Advanced data modeling techniques in creating Scripted Sources, Hierarchies, Levels, Variables, Custom Attributes & Measures and Packages
  • Experience in configuring Birst data source tables for snapshot or incremental load processing using six ETL scenarios
  • Experience in using Birst Connect to create tasks and sources
  • Experience in loading large data volumes into Birst using Birst Connect scripts
  • Designed and developed Birst reports/dashboards in Designer, Visualizer, and Dashboards 2.0, using report designs like results view, pivot control, sort, filter, drill behaviors, and charts
  • Experience in space administration and security like user/group management, SSO and report scheduling
  • Ability to troubleshoot ETL/Report issues and perform the root cause analysis
  • Expertise in DBA and/or SQL performance tuning tasks
  • Good communication skills; adaptable to new software technologies; quick learner; able to work independently
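
A minimal sketch of the kind of validation SQL referenced above; all table and column names are hypothetical placeholders, not from an actual engagement:

```sql
-- Hypothetical SCD Type 2 consistency check: every business key in the
-- dimension should have exactly one current (open-ended) row.
SELECT customer_bk,                 -- assumed business-key column
       COUNT(*) AS current_rows
FROM   dim_customer                 -- assumed Type 2 dimension table
WHERE  effective_end_date IS NULL   -- NULL end date marks the current version
GROUP  BY customer_bk
HAVING COUNT(*) <> 1;               -- any rows returned indicate a load defect
```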

TECHNICAL SKILLS:

ETL Tools: Pentaho (Kettle), SSIS, SAP Data Services 4.0 (BODS)

Databases: Oracle, PostgreSQL, MS SQL Server 2008R2/2005/2000

Database Tools: PL/SQL Developer, pgAdmin III, SQL Server Management Studio, Query Analyzer

Programming Languages: SQL, PL/SQL, JavaScript (Pentaho)

Reporting Tools: Birst, Qlik Sense and Pentaho BI

WORK EXPERIENCE:

Confidential, Atlanta, GA

ETL/Qlik Sense Report Developer

Responsibilities:

  • Involved in requirement gathering for building ETLs using Pentaho Data Integration (Kettle)
  • Involved in converting SSIS ETLs and SQL Server stored procedures into Pentaho ETLs (a reconciliation SQL sketch follows this list)
  • Optimized Pentaho jobs and performed data validation tasks
  • Developed various transformations in Pentaho DI using steps like Table Input, Select Values, Modified Java Script Value, Database Lookup, Stream Lookup, Group By, Sort Rows, String Operations, If Field Value is Null, Execute SQL Script, and Table Output
  • Reviewed and analyzed requirements for converting BO reports into Qlik Sense
  • Created reports using Qlik Sense
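
One plausible form of the validation used when porting a load from SSIS/stored procedures to Pentaho; the table and column names below are illustrative only:

```sql
-- Hypothetical reconciliation: compare row counts and a numeric checksum
-- between the table loaded by the legacy SSIS process and the table
-- loaded by the converted Pentaho ETL.
SELECT 'legacy'  AS loaded_by, COUNT(*) AS row_cnt, SUM(amount) AS amount_sum
FROM   sales_fact_legacy            -- assumed legacy target table
UNION ALL
SELECT 'pentaho' AS loaded_by, COUNT(*), SUM(amount)
FROM   sales_fact_pentaho;          -- assumed Pentaho-loaded target table
```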

Environment: Pentaho 5.2 Enterprise Edition, Oracle 11g, SQL Server, Qlik Sense 3.1, Pentaho Report Designer

Confidential, NY

ETL/Report Developer

Responsibilities:

  • Involved in all BI phases of the Analytics 2 project implementation for Confidential using MirthResults CRM, Pentaho ETL, PostgreSQL, and Birst Cloud reporting
  • Followed the Ralph Kimball methodology for staging data into the database before loading to Birst, as the data volume exceeded 1.5 TB
  • The MirthResults CRM system, which runs on PostgreSQL, was used as the source database
  • Involved in requirement gathering for ETLs using Pentaho for Type 1 & Type 2 dimensions
  • Involved in developing the ETL architecture for the Analytics 2 project, i.e., priority/parallel/sequential loads using control tables
  • Created source-to-target data mappings for the Analytics 2 Clinical and Operational Warehouse per the requirements gathered by the data modeler
  • Developed ETLs using Pentaho Community Edition 5.0 for loading data into staging tables
  • Used different strategies to populate the data depending on table size, such as truncate-and-reload and/or CDC/delta loads, per the data model requirements provided
  • Used different techniques in Pentaho to implement Type 2 dimensions without lookup steps, due to database performance issues (see the set-based SQL sketch after this list)
  • Developed various transformations using steps like Table Input, Select Values, Group By, Sort Rows, String Operations, If Field Value is Null, Execute SQL Script, Filter Rows, Merge Rows (diff), Update, and Table Output
  • Involved in validating the data for initial and incremental loads and produced unit test and validation documents for the client as part of unit testing
  • Involved in optimizing the Pentaho jobs and data validation tasks
  • Created shell scripts to execute and process the Pentaho ETLs and Birst loads using Birst Connect
  • Involved in post-production support activities for the ETLs
  • Using Pentaho report designer, designed various report types like line chart, bar charts (horizontal, vertical, stacked), pie charts, sub reports and drill down reports
  • Created ad-hoc reports using Pentaho Analyzer
  • Used advanced report features like passing parameters to reports/sub-reports, creating reports using cubes/PDI/Excel as sources, and creating report hyperlinks to call other reports
  • Created and deployed Pentaho cubes using Schema Workbench
  • Communicated reporting changes, enhancements, and modifications, verbally or through written documentation, to management and other employees so that issues and solutions were understood
  • Birst responsibilities included modeling the data, setting hierarchies/grains, targeting columns as measures/attributes/analyze-by-date, and setting data types and column lengths
  • Developed the snapshot or incremental load policies for each Birst data source
  • Created Scripted Sources (ETL data sources/derived sources)
  • Involved in gathering reporting requirements for operational reports
  • Designed and developed the operational reports using Birst Designer/Dashboards for Clinical and Operational spaces
  • Validated the report data against the staging table data using SQL queries in the PostgreSQL database as part of UAT
  • Experience in creating custom subject areas
  • Experience in Birst space customization: Custom Measures, Custom Attributes, Bucketed Measures, Variables, Packages, Aggregates and Drill maps
  • Knowledge in Birst Admin activities like Space Creation, Adding/Managing User(s) & Report Schedules
  • Experience in designer/visualizer module in creating reports, using the default or custom subject areas
  • Experience in creating Birst ad-hoc reports using results view, pivot control, sort, filter, drill behaviors and charts
  • Involved in design, analysis, and reporting for the HL7 quality project in the Birst environment using Live Access
  • Developed/tested/validated the HL7 live access scenario as one of the options
  • Validated the HL7 report data with staging tables as part of unit testing
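
A minimal sketch of a set-based Type 2 load of the kind described above (PostgreSQL syntax); the patient dimension, its columns, and the tracked attributes are assumptions for illustration:

```sql
-- Step 1: expire the current row for any business key whose tracked
-- attributes changed in the staging data.
UPDATE dim_patient d
SET    effective_end_date = CURRENT_DATE,
       is_current         = FALSE
FROM   stg_patient s
WHERE  d.patient_bk = s.patient_bk
  AND  d.is_current
  AND  (d.name <> s.name OR d.status <> s.status);

-- Step 2: insert a new current version for new keys and for keys whose
-- current row was just expired in step 1.
INSERT INTO dim_patient (patient_bk, name, status,
                         effective_start_date, effective_end_date, is_current)
SELECT s.patient_bk, s.name, s.status, CURRENT_DATE, NULL, TRUE
FROM   stg_patient s
LEFT   JOIN dim_patient d
       ON d.patient_bk = s.patient_bk AND d.is_current
WHERE  d.patient_bk IS NULL;
```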

Environment: Pentaho 5.0 Community Edition, PostgreSQL 9.x, Red Hat Linux, MirthResults CRM, Birst, Pentaho Report Designer

Confidential, NJ

ETL Developer

Responsibilities:

  • Responsible for designing, developing, and migrating ETL jobs from the Talend ETL tool to SAP BusinessObjects Data Services (BODS)
  • Involved in upgrading the ETLs so that one BODS job automatically processes files for different hospitals using the same EMR, whereas the existing Talend ETLs were replicated per hospital
  • Consolidated the ETLs and created an ETL metadata layer to fulfill the requirements of the different hospitals using the EMR system (see the metadata table sketch after this list)
  • Automated the entire ETL system to process all data files produced by different EMRs from different hospitals
  • Validated and tested the ETLs using SQL in the SQL Server database
  • Used an FTP process to extract files from the different hospital sites and processed them using the BODS jobs
  • Worked on various BODS transformations like Data Integrator transforms (Date Generation, History Preserving, Key Generation, Pivot & Reverse Pivot, Table Comparison), Platform transforms (Case, Map Operations, Merge, Query, SQL) and other functions available in the tool
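
A sketch of what the ETL metadata layer could look like as a single control table (SQL Server syntax); every name here is illustrative, not from the actual project:

```sql
-- Hypothetical control table: one row per hospital feed drives a single
-- generic BODS job instead of one replicated ETL per hospital.
CREATE TABLE etl_feed_metadata (
    feed_id        INT           NOT NULL PRIMARY KEY,
    hospital_code  VARCHAR(20)   NOT NULL,  -- which hospital sent the file
    emr_system     VARCHAR(20)   NOT NULL,  -- shared EMR producing the feed
    ftp_path       VARCHAR(200)  NOT NULL,  -- pickup location on the FTP site
    file_pattern   VARCHAR(100)  NOT NULL,  -- e.g. 'ADT_%.txt'
    target_table   VARCHAR(128)  NOT NULL,  -- staging table to load
    is_active      BIT           NOT NULL DEFAULT 1
);
```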

Environment: BODS 4.0, Talend, MS SQL Server DB, Windows 2008 Server

Confidential, Tampa, FL

ETL Developer

Responsibilities:

  • Responsible for preparing ETL requirement & technical specifications along with source to target data mapping document
  • Used SAP Data Services 4.0 for migrating data from transactional database to the Data Warehouse
  • Created Jobs, Workflows and Data Flows for dimension and fact tables per specifications and implemented the business logic for both initial and incremental loads
  • Created SAP Data Services mappings to load the data warehouse, using transformations like Key Generation, SQL Query Transform, Table Comparison, Merge, Case, Lookup, Map Operation, and History Preserving (Slowly Changing Dimensions Type 2) in data flows
  • Loaded summary tables per the functional requirement specifications for reporting
  • Tuned transformations for better job performance
  • Handled Slowly Changing Dimensions Type 1 & 2
  • Analyzed all tables and created various indexes to enhance query performance (see the index sketch after this list)
  • Involved in validating the ETLs while migrating them across the Development, QA, and Production environments, and provided validation documents for review
  • Identified data issues that occurred during ETL loading and rectified them
  • Attended weekly status call for updating the assigned task status and issue updates
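
As an illustration of the indexing work noted above (SQL Server syntax; the fact table and columns are hypothetical):

```sql
-- Hypothetical nonclustered index supporting a common date-range filter,
-- with the reported measures included to make the index covering.
CREATE NONCLUSTERED INDEX ix_fact_sales_date
ON fact_sales (date_key)
INCLUDE (customer_key, amount);
```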

Environment: SAP Data Services 4.X, SQL Server 2008

Confidential

SSIS Developer

Responsibilities:

  • Collaborated with the team on the design, development, and testing of the ETL strategy to populate data from various source-system feeds using SSIS, and created business requirement documents (BRDs) per the direction of the Data Architect and business users
  • Documented the ETL process and the mapping documents used to load data from source systems to targets
  • Creation of database objects like Tables, Views, Indexes, Constraints, Temporary tables etc. as per the requirement
  • Extensively used joins and subqueries to simplify complex queries involving multiple tables
  • Used SQL Profiler to monitor the server performance, debug T-SQL and slow running queries
  • Hands on Experience in installing, configuring, managing, monitoring and troubleshooting SQL Server 2000
  • Created ETL packages using SSIS to move data from heterogeneous data sources and load data to data warehouse
  • Extensively used various SSIS objects such as Data Flow components, Control Flow elements, Connection Managers, runtime events, and log providers
  • In Dataflow components, used Slowly Changing Dimension, Multicast, Merge Join, Lookup, Fuzzy Lookup, Conditional Split, Aggregate, Derived Column, Data Conversion and others
  • Used system variables and created user-defined variables for SSIS packages
  • Implemented event handlers and error handling in SSIS packages and notified various user communities of process results
  • Configured SQL Server Agent mail to send automatic emails when an SSIS package failed or succeeded
  • Created package-level and task-level logs for the ETL load to record the number of records processed by each package and each task within a package using SSIS (see the logging sketch below)
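
A minimal sketch of the package/task-level logging described above (T-SQL); the table and its columns are hypothetical:

```sql
-- Hypothetical audit table written to by Execute SQL Tasks in the
-- packages' event handlers.
CREATE TABLE etl_load_log (
    log_id         INT IDENTITY(1,1) PRIMARY KEY,
    package_name   VARCHAR(100) NOT NULL,
    task_name      VARCHAR(100) NULL,      -- NULL for package-level entries
    rows_processed INT          NULL,
    status         VARCHAR(20)  NOT NULL,  -- e.g. 'Success' or 'Failure'
    logged_at      DATETIME     NOT NULL DEFAULT GETDATE()
);

-- Typical parameterized statement issued from an Execute SQL Task, with
-- SSIS variables mapped to the ? placeholders (OLE DB connection).
INSERT INTO etl_load_log (package_name, task_name, rows_processed, status)
VALUES (?, ?, ?, ?);
```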

Environment: MS SQL Server 2005, SQL Server Integration Services (SSIS), SQL Server Reporting Services (SSRS), SSAS, MS Visual Studio 2005, T-SQL
