
Tibco Admin Resume


Milwaukee, WI


  • 8 years of experience in the IT industry as a BI Developer, using tools including TIBCO Spotfire and Tableau, with a strong background in database and data warehousing concepts.
  • Over 3 years of business intelligence and analytics experience, with strong experience using TIBCO Spotfire to design, develop, and support reports and dashboards for a variety of business users.
  • Designed and implemented BI processes in diversified data environments using TIBCO Spotfire and other tools and languages.
  • Professional industry experience with strong hands-on development experience in Enterprise Application Integration (EAI), SOA (web service) development, and B2B integration (EDI, RosettaNet).
  • Experience with TIBCO Spotfire Server versions 7.0, 6.x, and 5.x, Spotfire Web Player, and Spotfire Automation Services.
  • Installation and administration of Spotfire Servers, Spotfire Professional client, Spotfire Web Player, and Spotfire Automation Services.
  • Experience in healthcare claims, enrollments, status notification processing, and insurance claims (HIPAA EDI transactions such as 837, 834, 835, 277U, and 997).
  • Extensive experience in installation, configuration, deployment, and troubleshooting of the TIBCO ActiveEnterprise suite of applications: TIBCO Spotfire, TIBCO BusinessWorks, and TIBCO Designer.
  • Expert in developing reports with rich visualizations such as bar charts, line charts, graphical tables, combination charts, scatter plots, 3D scatter plots, pie charts, tables, cross tables, summary tables, tree maps, and heat maps, and interactive features such as multiple markings and filtering schemes.
  • Used basic IronPython scripting as required for Spotfire report development.
  • Deployed the Spotfire Library to UAT and Production environments.
  • Proficient in working closely with stakeholders and SMEs, preparing Business Requirement Documents (BRD), Functional Requirement Documents (FRD), Requirement Traceability Matrices (RTM), Project Scope Documents, Use Cases, and mockup reports.
  • Good understanding of the Software Development Life Cycle (SDLC) following Waterfall and Agile (Scrum) methodologies.
  • Data warehousing experience using Informatica PowerCenter 8.6.1/8.1/7.1 (Warehouse Designer, PowerConnect, PowerPlug, PowerAnalyzer), ETL, data marts, star schema, snowflake schema, and Business Objects.
  • Worked with Informatica components such as Designer, Workflow Manager, Workflow Monitor, and Repository Manager.
  • Involved in designing source-to-target mappings from sources to operational staging targets using star schema; implemented logic for slowly changing dimensions.
  • Extensive programming skills in Oracle SQL and PL/SQL, SQL Server 2005/2000, and scripting languages on Windows 2003/2000/NT and UNIX environments.
  • Experience in using UNIX shell scripts.
  • Created several UNIX scripts for data transformations to load the base tables, applying business rules based on the given data requirements.
  • Very good at collecting requirements, documenting specifications, and preparing Technical Design Documents.
  • Ability to work in teams and independently with minimal supervision to meet deadlines.
  • Good communication, analytical, and interpersonal skills.


Business Tools: TIBCO Spotfire 7.0/6.x/5.x, TIBCO EMS, TIBCO HAWK, TIBCO BW, Tableau, Excel, SSRS

ETL Tools: Informatica 8.x/7.x, Microsoft BI (SSAS, SSRS)

Database: MS SQL Server 2012/2008, DB2, Oracle 10g/11g

Languages: SQL, PL/SQL, HTML, CSS, XML, .NET, Java

Data Modeling Tools: Erwin 4.0/3.5, Star Schema, Snowflake Schema, OLAP, OLTP, JDeveloper, SQL Developer

Tools: TOAD, WinSQL, MS Office, Autosys, SQL*Plus, SQL*Loader

Operating Systems: Windows XP, Windows Vista, UNIX, Linux



Tibco Admin


  • Extensively involved in understanding the business case and scope of the project.
  • Extensively involved in collecting requirements for the reports being developed.
  • Installed and configured the TIBCO Spotfire environment.
  • Created logical measures in Spotfire Information Links as per the requirements.
  • Developed customized reports using TIBCO Spotfire Professional.
  • Worked on various new chart developments such as Tree Map, Box Plot, Parallel Coordinate Plot, Scatter Plot, and Map Chart.
  • Worked on complex topics in TIBCO Spotfire such as statistical analysis, IronPython scripting, and property controls.
  • Architected, implemented, and trained users on the dev-to-test-to-prod Spotfire report migration process.
  • Administered Spotfire programs and supported library management.
  • Evaluated log files and implemented performance tuning.
  • Identified support issues and initiated action to resolve problems.
  • Provided technical assistance during production issues when needed.
  • Analyzed and resolved problems within the assigned application stack.
  • Executed change management activities supporting production deployment for Developers, Quality Control Analysts, and Environment Management personnel.
  • Deployed/migrated Spotfire DXP files to UAT and Production environments.
  • Developed user security groups in TIBCO Spotfire, authenticating access to the tool and providing authorization to view limited data.
  • Worked extensively on Map Charts for spatial reporting.
  • Scheduled updates in Spotfire Web Player.
  • Scheduled auto-email jobs using Spotfire Automation Services.
  • Involved in HIPAA/EDI medical claims analysis, design, implementation, and documentation.
  • Helped design the data warehouses/ETL workflows required for the reporting applications.
  • Worked with various data sources, i.e., databases and file systems, and successfully performed data mashups.
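The data-mashup work mentioned above amounts to joining records from heterogeneous sources on a shared key. A minimal illustrative sketch, assuming hypothetical claim records (the column names, key, and values are invented for illustration, not taken from an actual project):

```python
# Hypothetical sketch: combine rows from a database query with
# records parsed from a flat file, joined on a shared claim ID.
db_rows = [
    {"claim_id": "C001", "status": "PAID"},
    {"claim_id": "C002", "status": "DENIED"},
]
file_rows = [
    {"claim_id": "C001", "amount": 120.50},
    {"claim_id": "C002", "amount": 80.00},
]

def mashup(db_rows, file_rows, key="claim_id"):
    """Merge two record lists on a shared key (an inner join)."""
    lookup = {row[key]: row for row in file_rows}
    return [
        {**db_row, **lookup[db_row[key]]}  # file columns extend db columns
        for db_row in db_rows
        if db_row[key] in lookup
    ]

merged = mashup(db_rows, file_rows)
```

In Spotfire itself this kind of combination would normally be done with Information Links or in-client data table relations; the snippet only illustrates the underlying join.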

Environment: Spotfire 6.5/5.x, EMS, BW 5.12.0, Spotfire Automation Services 7.0, EDI X12N 5010, Spotfire Statistics Services 7.0, Spotfire Web Player 7.0, Spotfire ADS 7.0, Oracle 11g/10g, SQL Server 2012/2008, TOAD, SQL, Windows Server.

Confidential, Milwaukee, WI

Tibco Developer


  • Gathered requirements from the business team and translated them into technical architecture and design documents.
  • Extracted data from various sources such as databases, CSV files, and Excel documents, transforming it into a standard format and loading it so that it is available for analysis and report creation.
  • Designed the Spotfire Information Model, Elements, and Information Links to create reports.
  • Installed and configured Spotfire 6.0.
  • Assisted in upgrading from Spotfire 5.0.
  • Created groups and then set up users, groups, and licenses as Spotfire Administrator, and restricted group members to accessing only particular folders in the library as Library Administrator.
  • Established the migration process and moved from lower environments to UAT and Production.
  • Developed reports using visualizations such as bar chart, graphical table, scatter plot, combination chart, pie chart, table, cross table, and summary table.
  • Installed and configured the Spotfire SDK.
  • Created customized reports and dashboards using the Spotfire SDK.
  • Provided training to users on how to use the product.
  • Created technical specification documents based on the requirements documents.
  • Shared knowledge with testing teams to perform testing on developed reports.

Environment: Tibco Spotfire 6.0/5.x, HAWK, Tibco Webplayer, Tibco Spotfire Professional, Tibco Spotfire SDK, Oracle 10g, SQL, SQL Developer, MS Access.

Confidential, Overland Park, KS

ETL Developer


  • Created source definitions for relational and flat file sources and implemented target definitions based on star schema design.
  • Used Informatica as an ETL tool to extract data from source systems to the target system. Source systems are mainly flat files, CSV files, DB2, and relational tables; the target is Salesforce.
  • Migration of existing data (flat files, CSV files, DB2) to Salesforce.com.
  • Developed Mappings using various transformations like connected/unconnected Lookups, Filter, Router, Aggregator, Expression, Stored procedure, Sequence Generator, Update strategy and Joiners etc. depending upon requirement.
  • Worked with Informatica components such as Designer, Workflow Manager, Workflow Monitor, and Repository Manager.
  • Used the Informatica Designer to create complex Mappings and Mapplets.
  • Used Repository Manager to create user groups and user profiles with privileges of Administrator and Setting up the security for creating user groups and assigning privileges.
  • Used the workflow manager to create sessions and batches.
  • Designed Type 1 (overwriting, no history) and Type 2 (flag, version, date) mappings.
  • Implemented code migration from development to testing.
  • Tuned sources, targets, mappings and sessions to improve the performance of data load into Oracle database
  • Migrated the repository from Informatica 8.6 to 9.1 and performed testing.
  • Knowledge of star schema and snowflake schema to fit reporting, query, and business analysis requirements.
  • Created several UNIX scripts for data transformations to load the base tables, applying business rules based on the given data requirements.
  • Created UNIX shell scripts for automating the data transformation process and check if standards are met.
  • Involved in scheduling and deploying these reports.
  • Developed logical models, building hierarchies, attributes, and facts by extracting tables from the warehouse.
  • Used workflow manager for session management, database connection management and scheduling of jobs.
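The Type 2 (flag, version, date) mappings listed above follow a standard pattern: when a tracked attribute changes, the current dimension row is expired and a new versioned row is inserted. A minimal Python sketch of that logic, with hypothetical column names (`cust_id`, `city`) chosen purely for illustration; in the actual work this was implemented as Informatica mappings, not Python:

```python
from datetime import date

def apply_scd2(dimension, incoming, key="cust_id", tracked=("city",),
               today=None):
    """Type 2 slowly changing dimension: expire the current row and
    append a new version when a tracked attribute changes."""
    today = today or date.today().isoformat()
    current = next(
        (r for r in dimension if r[key] == incoming[key] and r["is_current"]),
        None,
    )
    # No change in any tracked attribute: nothing to do.
    if current and all(current[c] == incoming[c] for c in tracked):
        return dimension
    if current:  # expire the old version
        current["is_current"] = False
        current["end_date"] = today
    version = (current["version"] + 1) if current else 1
    dimension.append({
        **incoming,
        "version": version,
        "is_current": True,
        "start_date": today,
        "end_date": None,
    })
    return dimension

dim = []
apply_scd2(dim, {"cust_id": 1, "city": "Milwaukee"}, today="2015-01-01")
apply_scd2(dim, {"cust_id": 1, "city": "Boston"}, today="2015-06-01")
```

After the second call, the first row is flagged historical with an end date, and the new row carries version 2, preserving history exactly as a Type 2 dimension requires.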

Environment: Informatica PowerCenter 8.6.1/8.1, DB2, Salesforce, SQL, PL/SQL, SQL Developer, Windows XP, UNIX Shell Scripting.

Confidential, Boston, MA

Informatica Developer


  • Negotiated features and the timeframes associated with delivering them in order to meet project/business goals.
  • Managed the deliverables and their integration with the overall project plans; coordinated tasks across database team members and managed changes during the development cycle.
  • Worked with ETL tool to extract data from source systems to Target system. Source Systems are mainly flat files, CSV files, DB2 and relational tables.
  • Worked with Informatica components such as Designer, Workflow Manager, Workflow Monitor, and Repository Manager.
  • Created reusable transformations and Mapplets and used them in complex mappings.
  • Defined Target Load Order Plan for loading data into different Target Tables.
  • Worked on programs for scheduling data loading and transformations using DataStage from the legacy system to Teradata, using FastLoad, MultiLoad, and FastExport.
  • Implemented SCD methodology, including Type 1 and Type 2 changes, to keep track of historical data.
  • Involved in performance tuning of the mappings, sessions and workflows.
  • Prepared migration document to move the mappings from development to production repositories.
  • Worked on Shell and Perl scripting to develop generic wrapper scripts.
  • Involved in writing automation testing scripts using Shell and Perl, and worked with crontab to schedule these scripts in the DEV and QA environments.
  • Created UNIX shell scripts to invoke the stored procedures for the control process for the parameter file generation.
  • Involved in Unit, Integration, system, and performance testing levels.
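The parameter-file generation mentioned above typically produces a plain-text file with a `[section]` header per workflow followed by `$$NAME=value` lines, which Informatica reads at session start. A hypothetical sketch of that generation step (the workflow name, parameter names, and paths are invented for illustration; the real work used shell scripts and stored procedures):

```python
def write_param_file(path, workflow, params):
    """Write an Informatica-style parameter file: a [section] header
    followed by $$NAME=value lines, one parameter per line."""
    lines = [f"[{workflow}]"]
    lines += [f"$${name}={value}" for name, value in params.items()]
    with open(path, "w") as fh:
        fh.write("\n".join(lines) + "\n")
    return path

# Hypothetical workflow and parameters, for illustration only.
write_param_file(
    "wf_load_orders.par",
    "Folder.WF:wf_load_orders",
    {"RUN_DATE": "2015-06-01", "SRC_DIR": "/data/incoming"},
)
```

Regenerating this file on each run (e.g. from a scheduler such as crontab) is what lets the same workflow pick up a fresh run date and source directory without any mapping changes.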

Environment: Informatica 8.1/7.1, Oracle 10g, SQL, PL/SQL, TOAD, Windows XP, UNIX.


ETL Developer


  • Extensively used Informatica to load data from flat files to DB2 and from Oracle to Oracle.
  • Created mappings using transformations such as Source Qualifier, Aggregator, Expression, Lookup, Router, Normalizer, Filter, Update Strategy, and Joiner.
  • Worked with Informatica components such as Designer, Workflow Manager, Workflow Monitor, and Repository Manager.
  • Checked data consistency, accuracy, and integrity.
  • Developed UNIX shell scripts to automate the data loading process.
  • Understood the requirements of the client and the existing system.
  • Reviewed existing Informatica mappings, sessions, and workflows.
  • Mapped business requirements to technical specifications and designed logic to implement them.
  • Developed Informatica mappings, sessions, and workflows.
  • Tested the mappings and checked the quality of the deliverables.
  • Tested the mappings for compliance with required functionality and standards.
  • Validated migrated data.

Environment: Informatica 8.1/7.1, Oracle 10g, SQL, TOAD, Windows XP, UNIX.
