
Sr Pentaho Developer Resume

SUMMARY

  • 11+ years of experience in full life cycle development involving analysis, design, development, deployment, testing, implementation, maintenance, and support of large data warehouse applications in Web-based and Client/Server environments.
  • Experience in building Data Warehouse and Business Intelligence systems, including Pentaho ETL and Cognos.
  • Hands-on experience in Data Warehouse Star Schema modeling, Snowflake modeling, Fact & Dimension tables, and Physical and Logical Data Modeling.
  • ETL and BI experience using the Pentaho suite.
  • Developed mappings in the Pentaho Data Integration (PDI) ETL tool to load data using various steps, including Row Normalizer, Row Denormalizer, Database Lookup, Fuzzy Lookup, Database Join, Calculator, Add Sequence, Merge Join, and Insert/Update.
  • Experience in integrating XML and JSON format files into the Data Warehouse.
  • Experience in real-time data processing into the Data Warehouse database.
  • Experience in loading high-volume data into destination tables using batch jobs.
  • Upgraded the Pentaho ETL (PDI) tool from 3.2 to 7.0 and 8.1.
  • Migrated reports from SSRS to Pentaho.
  • Experience in resolving ongoing maintenance issues and bug fixes; monitored PDI sessions and performance-tuned mappings and sessions.
  • Experience in Data Encryption.
  • Worked closely with business analysts, end users, and the QA team to ensure technical compatibility.
  • Expertise in administration, design, development, and deployment of Data Warehousing and Business Intelligence applications using Cognos 8, 10, and Cognos Analytics 11.
  • Experience in Framework Manager modeling (Physical Layer, Business Layer, Packages).
  • Experience in migrating Reports & Packages from the Cognos 10 series to Cognos Analytics 11.
  • Knowledge of AWS concepts.
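
For illustration, the XML/JSON-to-warehouse integration described above can be sketched as a small flattening routine in Python. This is a minimal, hypothetical example (function and field names are invented), not the PDI implementation used on the projects:

```python
import json

def flatten(record, parent_key="", sep="_"):
    """Recursively flatten a nested JSON object into a single-level
    dict whose keys map onto relational warehouse columns."""
    items = {}
    for key, value in record.items():
        new_key = f"{parent_key}{sep}{key}" if parent_key else key
        if isinstance(value, dict):
            items.update(flatten(value, new_key, sep))
        else:
            items[new_key] = value
    return items

# A nested source record becomes one flat warehouse row.
raw = '{"id": 7, "customer": {"name": "Acme", "region": "West"}}'
row = flatten(json.loads(raw))
# row == {"id": 7, "customer_name": "Acme", "customer_region": "West"}
```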

TECHNICAL SKILLS

Business Intelligence Tools: IBM Cognos Analytics 11, Framework Manager, Power BI, Pentaho Report Designer, SSRS.

ETL Tools: Pentaho (PDI), SSIS.

Databases: SQL Server, Oracle, DB2, Amazon Redshift

DB Tools: SQL*Plus, SQL*Loader

Web Servers: IIS, Tomcat and Apache

Programming skills: HTML, JavaScript, SQL, Python, Unix Shell scripting

PROFESSIONAL EXPERIENCE

Confidential

Sr Pentaho Developer

Responsibilities:

  • Created data mappings for source data coming from APIs, Table input, Text file input, and Excel input.
  • Prepared interval data for the measurement table from legacy data.
  • Generated canonical files using Gzip for newly added data and pushed them to C3 platforms (Analysis).
  • Used a wide range of steps in Pentaho transformations, including Database Lookup, Database Join, Calculator, Add Sequence, Add Constants, and various types of inputs and outputs.
  • Analyzed data sources, identified issues that could impact the ETL process, and addressed them.
  • Automated file transfers to the archive location after data loads into data warehouse tables.
  • Created SCD Type 2 dimensions and loaded incremental data into staging tables using Change Data Capture.
  • Worked on bulk data loads into Fact tables using batch jobs.
  • Supported the business/reporting teams by making enhancements to the existing flow and providing post-implementation product support.
  • Created reports and dashboards from structured and unstructured data using formulas and calculated fields.
  • Deployed reports on the Pentaho BI Server to give users central web access.
  • Implemented security in Pentaho reports by assigning permissions so only specific users could view the reports.
  • Performed code reviews of transformations delivered by the offshore team.
  • Troubleshot issues with failed ETL processes in the Production environment.
  • Performed ETL deployments.
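
The SCD Type 2 pattern mentioned above can be sketched in Python for illustration. This is a hypothetical in-memory version with invented column names; the actual work used PDI's dimension-handling steps:

```python
from datetime import date

def apply_scd2(dimension, incoming, today=None):
    """Apply a Slowly Changing Dimension Type 2 update: expire the
    current row when a tracked attribute changes and insert a new
    current row. `dimension` is a list of dicts keyed by the business
    key `cust_id`, with tracked attribute `city` and validity columns."""
    today = today or date.today()
    for new in incoming:
        current = next((r for r in dimension
                        if r["cust_id"] == new["cust_id"] and r["is_current"]),
                       None)
        if current is None:
            # Brand-new business key: insert first version.
            dimension.append({**new, "valid_from": today,
                              "valid_to": None, "is_current": True})
        elif current["city"] != new["city"]:
            # Tracked attribute changed: close old version, open new one.
            current["valid_to"] = today
            current["is_current"] = False
            dimension.append({**new, "valid_from": today,
                              "valid_to": None, "is_current": True})
    return dimension
```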

Environment: MS SQL Server 2016, Windows 2016, Pentaho Kettle (PDI) 8.3, Pentaho Report Designer 7.1, SSRS, SVN.

Confidential

Sr BI Developer

Responsibilities:

  • Created Data Flow Mappings to extract data from source systems and load it into the target.
  • Used Pentaho Data Integration/Kettle to design all ETL processes to extract data from various sources including live system and external files, cleanse and then load the data into target data warehouse.
  • Created transformations that involve configuring the following steps: Table input, Table output, Text file output, CSV file input, Insert/Update, Add constants, Filter, Value Mapper, Stream lookup, Join rows, Merge join, Sort rows, Database Lookup, Set Environment Variables.
  • Created and saved Pentaho jobs in the enterprise repository and scheduled them to run in production on a daily/weekly basis.
  • Used dimension lookup/update step to populate data into SCDs.
  • Used the Pentaho Enterprise Repository to create folders, store transformations and jobs, move, lock, revise, delete, and restore artifacts.
  • Scheduled meetings with Senior Data Leads and analyzed the data.
  • Developed Stored Procedures to get the data from various Facets and load into temporary tables.
  • Prepared the documents for all the modules developed.
  • Used Pentaho Report designer to create various reports having drill down functionality by creating Groups in the reports and drill through functionality by creating sub-reports within the main reports.
  • Created single value as well as multi-value drop down and list type of parameters with cascading prompt in the reports.
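
What the Stream Lookup step listed above does can be illustrated with a short Python sketch: index the lookup stream in memory, then enrich each main-stream row. This is a simplified, hypothetical single-key version, not PDI's implementation:

```python
def stream_lookup(main_rows, lookup_rows, key, fields, default=None):
    """Mimic PDI's Stream Lookup step: build an in-memory index over
    the lookup stream, then copy the requested fields onto each main
    row, falling back to `default` when no match is found."""
    index = {r[key]: r for r in lookup_rows}
    enriched = []
    for row in main_rows:
        match = index.get(row[key], {})
        enriched.append({**row, **{f: match.get(f, default) for f in fields}})
    return enriched
```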

Environment: MS SQL Server 2012, Windows Server 2012, Pentaho Kettle (PDI) 8.1, Pentaho Report Designer 7.1, Tidal, SVN, JIRA, Dart, Code Warden, SSRS, and Cognos

Confidential

ETL Developer

Responsibilities:

  • Interacted with the Business Analysts to understand the process flow and the business.
  • Actively participated in the team's requirements gathering for this BI project and in the physical and logical design of the data warehouse.
  • Created Data Flow Mappings to extract data from source systems and load it into the target.
  • Loaded large volumes of data into warehouse tables using a batch ETL process.
  • Used Pentaho Data Integration to design all ETL processes to extract data from various sources, including live systems and external files, cleanse it, and then load the data into the target data warehouse.
  • Created transformations that involve configuring the following steps: Table input, Table output, Text file output, CSV file input, Insert/Update.
  • Created and saved Pentaho jobs in the enterprise repository and scheduled them to run in production on a weekly basis.
  • Used the Dimension lookup/update step to populate data into SCDs.
  • Experienced in performing Data Masking/Protection using Pentaho Data Integration (Kettle).
  • Dealt with Slowly Changing Dimensions Types 1 and 2.
  • Involved in Production Support to research and resolve daily load issues.
  • Worked on ETL flow documentation.
  • Performed ETL deployments.
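
One common data-masking approach, salted-hash pseudonymization, can be sketched in Python for illustration. This is a hypothetical example (the projects above used PDI's own masking/protection facilities), and the salt handling here is deliberately minimal:

```python
import hashlib

def mask_column(rows, column, salt="etl-masking-salt"):
    """Pseudonymize a sensitive column by replacing each value with a
    truncated salted SHA-256 digest. Identical inputs map to identical
    tokens, so joins on the masked value still work, but the original
    value cannot be read back from the token."""
    for row in rows:
        digest = hashlib.sha256((salt + str(row[column])).encode()).hexdigest()
        row[column] = digest[:16]  # truncated digest used as the masked token
    return rows
```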

Environment: Pentaho (PDI ETL) 7.1, SQL Server 2014, Oracle 10g, TOAD, XML, SQL, PL/SQL, MS Excel, Shell Scripting, SVN

Confidential

ETL Developer/Pentaho

Responsibilities:

  • Created user accounts in Pentaho Enterprise Console for end users/Business Analysts who were supposed to view the reports using Pentaho User Console.
  • Created mapping documents to define and document one-to-one mapping between source data attributes and entities in target database.
  • Used Pentaho Data Integration to create all ETL transformations and jobs.
  • Used different types of input and output steps for various data sources, including tables, Access, text files, Excel, and CSV files.
  • Encrypted user information.
  • Identified and analyzed data discrepancies and data quality issues and worked to ensure data consistency and integrity.
  • Implemented Slowly Changing Dimension Type 1 and Type 2 in ETL jobs for certain dimensions.
  • Wrote UNIX shell scripts and PL/SQL scripts to automate daily routine jobs for production databases.
  • Modified existing Oracle PL/SQL code for stored procedures, functions, and packages.
  • Saved Pentaho jobs in the enterprise repository and scheduled them to run in production on a daily basis.
  • Installed Cognos Analytics 11 on Linux machines.
  • Configured the Cognos Gateway with the Apache web server.
  • Created user roles in Cognos.
  • Performed Cognos admin activities such as daily backups, log checking/cleaning, and validating data source connections.
  • Published Cognos packages from Framework Manager and deployed reports (Development to Test, Pre-production, and Production).
  • Scheduled reports from the Cognos portal per requirements.
  • Validated packages and all reports associated with the packages.
  • Tuned queries to improve report performance.
  • Created complex reports involving query prompts, layout calculations, conditions, and filters in Report Studio.
  • Improved the performance and usability of reports by creating appropriate prompts and filters.
  • Developed multi-page reports with prompt pages and conditional variables according to end-user requirements.
  • Integrated reports into third-party applications.
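
The daily cleanup automation described above (scripted log/backup housekeeping) can be illustrated with a short Python sketch. The actual jobs were UNIX shell scripts; this is a hypothetical equivalent with invented directory names:

```python
import gzip
import os
import shutil
import time

def archive_old_logs(log_dir, archive_dir, max_age_days=7):
    """Compress log files older than `max_age_days` into an archive
    directory and delete the originals, mirroring the nightly cleanup
    a cron-driven shell script would perform."""
    os.makedirs(archive_dir, exist_ok=True)
    cutoff = time.time() - max_age_days * 86400
    archived = []
    for name in os.listdir(log_dir):
        path = os.path.join(log_dir, name)
        if os.path.isfile(path) and os.path.getmtime(path) < cutoff:
            target = os.path.join(archive_dir, name + ".gz")
            with open(path, "rb") as src, gzip.open(target, "wb") as dst:
                shutil.copyfileobj(src, dst)
            os.remove(path)
            archived.append(name)
    return archived
```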

Environment: Cognos 10.2, Oracle 12c, PowerPoint, Microsoft Excel, Microsoft Word, Erwin, Cognos Framework Manager, Pentaho Kettle PDI 6.1, SVN.

Confidential

Senior BI Analyst

Responsibilities:

  • Developed ETL process using Pentaho PDI to extract the data from legacy system.
  • Worked with DBAs to archive the old data to get more performance on the loads.
  • Scheduled the jobs using Pentaho Scheduler.
  • Provided technical support to other developers and coordinated escalation of issues.
  • Built Relational Models for reporting and DMR models for OLAP-style analysis and reporting, using a multi-layer approach and modeling best practices to meet the reporting requirements.
  • Involved in the design and development of the security architecture to meet business needs.
  • Ensured ambiguous relationships were resolved and implemented model enhancement techniques to improve model performance.
  • Created Model Query Subjects, Data Source Query Subjects, and Standalone and Embedded Filters to implement complex business logic. Implemented internal Cognos security (column-level and row-level security) using security tables to secure sensitive data with Security Filters, Macros, Session Parameters, and Parameter Maps.
  • Developed hierarchies and measures to support ad hoc and multi-dimensional reporting.
  • Published packages to Cognos Connection and implemented package-level security. Managed user IDs and user groups, including granting privileges to users and user groups in Access Manager.
  • Imported Transformer cubes as data sources into Framework Manager and created models and packages for Analysis Studio.
  • Developed report templates for standard and ad hoc reporting to give the reports a consistent look.
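
The security-table-driven row-level security described above can be illustrated with a tiny Python sketch: keep only the rows whose region the user is entitled to see. This is a hypothetical simplification; in the actual model it was implemented with Framework Manager security filters and parameter maps:

```python
def apply_row_security(rows, security_table, user, region_col="region"):
    """Mimic row-level security driven by a security table: look up the
    regions granted to `user`, then filter the fact rows to that set."""
    allowed = {s["region"] for s in security_table if s["user"] == user}
    return [r for r in rows if r[region_col] in allowed]
```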

Environment: MS SQL Server 2008, Oracle 11g, Cognos 8.4, Cognos 10, Cognos Framework Manager 8 and 10, Unix, Windows Server 2008, Pentaho Kettle (PDI) 3.2, 4.1.
