
Analytics Engineer Resume

Alpharetta, GA


  • Strong ability to architect the Tibco Spotfire platform (latest version worked with: 7.0).
  • Fluent in object-oriented programming practices, especially Java, Python/IronPython, and C#.
  • Deep technical expertise in database/business intelligence, specifically data warehouse design, data quality, ETL, and SQL.
  • Complete understanding of the Spotfire API classes frequently used for scripting and customization.
  • Ability to use all aspects of Tibco Spotfire Professional built-in features, including: Visualizations (Table, Bar Chart, Line Chart, Pie Chart, Combination Chart, Scatter Plot, Box Plot, Heat Map, Summary Table, Graphical Table, Text Area)
  • Core Features (Filters, Filtering Schemes, Saving/Exporting, Marking, Details-on-Demand)
  • Statistics and Data Manipulation (Transformation, Binning, Calculated Column, Function, Similarity and Clustering, Error Bars, Lines & Curves, Regression modeling/Predictive Analysis)
  • Data Module (Loading, Merging, Adding Column/Row/Table, Relation, Information Link, Data-On-Demand)
  • Custom Expressions, Functions, Node Navigation Methods
  • Authoring Module (Tags, Lists, Bookmarks, Collaboration Panel, Using File As Template, Dashboards, Script Control, Text Area, Coloring Concept)
  • Strong scripting skills in IronPython to customize/automate key tasks, covering key API objects such as:
  • Access page objects via {Document.Pages & Document.ActivePageReference}
  • Access visualization object via {Document.ActivePageReference.Visuals}
  • Manipulate properties via {Spotfire.Dxp.Data.DataProperty & Spotfire.Dxp.Data.DataPropertyRegistry}
  • Access data table via {Document.Data.Tables.TryGetValue}
  • Manipulate markings via {Document.Data.Markings}
  • Reference lines and curves via {Spotfire.Dxp.Application.Visuals.FittingModels}
  • Manipulate color via {Spotfire.Dxp.Application.Visuals.ColorAxis}
  • Access filters via {Document.ActivePageReference.FilterPanel}
  • Strong understanding of the Document Model Framework and the process to build custom SDK extensions via C#/.NET, including:
  • Custom Configured Visualization
  • Custom Visualization
  • Custom Information Modeler Tool
  • Custom Data Source
  • Custom Calculation
  • Comprehensive understanding of Information Designer, encompassing the following features:
  • Information Links
  • Parameterized Information Links
  • Personalized Information Links
  • Strong skills in GeoAnalytics, covering the following features:
  • Geocoding System (GeoAnalytics)
  • Co-ordinate Matrix system (Non-Spatial Geographical Distribution)
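The API objects listed above are typically exercised from a script control. A minimal illustrative sketch in IronPython (this runs only inside the Spotfire client, where `Document` is injected into scope; the table name "myTable" is a placeholder):

```python
from Spotfire.Dxp.Application.Visuals import VisualContent

# Iterate pages and the visuals on the active page
for page in Document.Pages:
    print(page.Title)
for visual in Document.ActivePageReference.Visuals:
    content = visual.As[VisualContent]()
    print(content.Title)

# Look up a data table by name; in IronPython the .NET out-parameter
# comes back as part of the returned tuple
(found, table) = Document.Data.Tables.TryGetValue("myTable")
if found:
    # Inspect each marking's current row selection on that table
    for marking in Document.Data.Markings:
        rows = marking.GetSelection(table).AsIndexSet()
        print(marking.Name, rows.Count)
```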


Confidential, Alpharetta, GA

Analytics Engineer


  • Created Information Links to load data on demand, controlled by document properties, expressions, limiting, and parameterization.
  • Used the JavaScript D3 library to create visualizations sourced from JSON files.
  • Created map charts comprising Map Layers and Feature Layers based on geocoding tables and coordinate matrix systems.
  • Wrote IronPython scripts to customize dashboards by implementing tagging analysis, document property manipulation, visualization property manipulation, table manipulation, and file/data manipulation.
  • Used HTML and CSS along with JavaScript features to design Text Areas.
  • Used Web Player Configuration to control initial state of dashboards.
  • Used Marking and Filtering selection via IronPython scripts to automate the functionalities of dashboards.
  • Tested usage analysis comprising memory usage, CPU usage, network usage, user/group activities, and library activities.
  • Created Calculated columns and Custom expressions to manipulate data loaded in Spotfire.
  • Wrote Hive QL to load data on demand from Hortonworks Hive Metastore.
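On-demand loads of the kind described above boil down to parameterized HiveQL. A small standalone Python sketch of building such a query (the table and column names are hypothetical; a real information link would bind the parameter from a document property):

```python
def build_on_demand_hql(table, columns, region):
    """Build a HiveQL query for an on-demand load, filtered by region."""
    col_list = ", ".join(columns)
    # Hive string literals use single quotes; escape any embedded quotes
    safe_region = region.replace("'", "\\'")
    return "SELECT %s FROM %s WHERE region = '%s'" % (col_list, table, safe_region)

query = build_on_demand_hql("sales_mart", ["order_id", "amount"], "GA")
print(query)
```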

Environment: Hortonworks, Hive, SQL Server, Tibco Spotfire 7.0, Microsoft Office, Microsoft Outlook, DbVisualizer

Confidential, Berkeley Heights, NJ

Spotfire Engineer/ Spotfire Architect


  • Created non-spatial geographical distribution charts and map charts to display metrics via geocoding and coordinate mapping.
  • Used IronPython scripts to loop through filters, markings, visuals, and data columns to build customized functionality within Spotfire documents.
  • Used the Spotfire Library Administrator to manage folders, .dxp files, information links, and connectors.
  • Installed and configured the SQL Server, SAP HANA, SSAS, and other connectors to create in-database connections.
  • Used Spotfire Automation Services to create automated jobs via XML builds and scheduled them on a daily or weekly basis.
  • Managed users, groups, privileges, and security of the Spotfire environment using LDAP/Spotfire integration.
  • Developed Custom Data Source via SDK module for Google Analytics Connection.
  • Wrote custom expressions for visualization axes, calculated columns, property controls, and limit expressions.
  • Created custom Text Areas via HTML and JavaScript.
  • Verified HDFS (Hadoop Distributed File System) files generated via Pig Latin and Java Pig UDFs for use in metrics with the Cloudera connector.
  • Used Advanced Data Services (ADS) over Cisco Composite (CIS) servers to create data virtualization layers.
  • Wrote MapReduce scripts via Pig Latin to reduce source web-log data on the Hadoop platform, using Apache Oozie as the workflow scheduler.
  • Designed Information Links to mimic in-memory connections.
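Looping through filters, as in the scripts described above, commonly goes through the filter panel's table groups. An illustrative IronPython sketch (Spotfire-hosted only; `Document` is provided by the client at runtime):

```python
# Walk every filter on the active page via the filter panel
panel = Document.ActivePageReference.FilterPanel
for group in panel.TableGroups:
    for handle in group.FilterHandles:
        flt = handle.FilterReference
        print(flt.Name)
```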

Environment: Tibco Spotfire 6.5.2, SQL Server 2008, Cloudera, Apache Pig, Apache Hadoop, ADS, CIS, Java, SQL Server Analysis Services (SSAS), SAP HANA, MS Excel, MS Word, MS Outlook, Salesforce

Confidential, Owings Mills, MD

Spotfire Engineer/Spotfire Developer


  • Created business requirements and converted them into technical specifications.
  • Analyzed business needs for metrics and created statistical methods to derive the metrics at various levels of granularity.
  • Analyzed the feasibility of IronPython scripts to automate tasks via Custom Extension Development mode.
  • Analyzed all available SDK templates to create custom requirements via Document Model Framework.
  • Created Custom Calculations, Custom Panels and Custom Data Methods.
  • Used Package builder to deploy add-ins.
  • Created test scripts and executed test cases at the unit, system, and UAT levels.
  • Converted Business Objects reports into Spotfire dashboards, prioritizing the ad-hoc reporting mode.
  • Analyzed PL/SQL stored procedures for source data assessment and information link query assessment.
  • Developed bridge tables to link tables in the existing data warehouse.
  • Created Custom Expressions, Information Links and Calculated Columns.
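Custom expressions and calculated columns of the kind listed above use Spotfire's expression language. Two illustrative fragments (the column names [Sales] and [Amount] are hypothetical): the first computes a cumulative sum along the category axis via an OVER function; the second is a calculated-column expression bucketing values.

```
Sum([Sales]) OVER (AllPrevious([Axis.X]))
If([Amount] > 1000, "Large", "Small")
```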

Environment: Tibco Spotfire 4.X/5.X, MS Office, MS Access, Business Objects, Oracle, PL/SQL

Confidential, Houston, TX

Spotfire Technical Consultant/Spotfire Developer


  • Created business requirements, functional requirements, and technical specifications to develop metrics for business needs.
  • Conducted business analysis to assess the feasibility of Spotfire functionality for statistical manipulation of data.
  • Converted existing Business Objects Reports into Spotfire Dashboards.
  • Conducted quality assurance testing of the underlying Spotfire API code.
  • Packaged, deployed, and refactored custom extensions.
  • Trained end users on various features of the Spotfire Professional client tool.
  • Created add-ins for data methods, including row methods and column methods.
  • Trained developers on various Client APIs and Document Model Framework.
  • Analyzed source data from various sources using analytical SQL.
  • Monitored Transformations and Workflow in Informatica.
  • Assisted analysts in Data Quality work and cleansing.
  • Conducted weekly meetings to discuss project status.

Environment: Tibco Spotfire 4.X, Informatica, Business Objects, Microsoft Office package, Toad for Oracle, in-built tools for communication/requests


Data Quality Technical Consultant


  • Monitored weekly AutoSys jobs in the Dev, Qual, and Prod regions. Created sprint logs for the Agile work mode. Created job files (in XML) to automate reporting tasks.
  • Created technical specifications for modifying Informatica mappings, and technical specifications listing the data profile, including constraints, indexes, views, sequences, synonyms, cardinality/optionality, measures (metrics), dimensions, referential integrity, aggregations, calculations, transformations, and statistical modeling, in order to pre-design the data load from source to target.
  • Monitored Informatica session logs in order to assess and rectify data cycle errors to address data quality issues.
  • Assessed existing data marts and designed bridge tables to connect different data tables.
  • Modified Source Qualifier queries in Informatica mappings.
  • Used analytic and statistical functions via SQL to analyze and process data at the aggregate level, including ROLLUP, CUBE, and GROUPING SETS.
  • Used DML and DDL statements via SQL to manipulate and define data sets.
  • Used joins, subqueries, and set operators via SQL to cross-analyze data across multiple tables.
  • Used substitute, character, numeric, conversion, and date-time functions, along with pivots, hierarchies, and aggregation, via SQL to modify data column values.
  • Used data security (DCL) and transaction control (TCL) statements via SQL.
  • Wrote test strategies and test cases to assess Informatica code changes, including initial code debugging.
  • Converted Business Objects ad-hoc reports to dashboards and visualizations (pie charts, bar charts, pivots, line charts, cross tables, and heat maps).
  • Maintained source data flow using Excel macros, flat files, and relational tables. Accessed files from mainframe systems for use in Informatica ETL.
  • Created Technical Specification for ‘Insert Else Update’ Process as part of Slowly Changing Dimensions.
  • Used Data Profiling Services in SQL Server 2008 to document Column Null Ratio, Column Statistics, Column Value Distribution, Column Length Distribution, and Column Pattern profiles.
  • Reviewed Informatica PowerCenter Designer mappings, Informatica Workflow Manager for workflows/sessions, AutoSys job scheduling, and Informatica Workflow Monitor for session logs.
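The analytic SQL described above can be sketched with the standard library's sqlite3 module, whose engine supports window functions (SQLite 3.25+; the table and data here are made up for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (region TEXT, amount INTEGER)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [("East", 100), ("East", 300), ("West", 200)])

# Running total per region via an analytic (window) function
rows = conn.execute("""
    SELECT region, amount,
           SUM(amount) OVER (PARTITION BY region ORDER BY amount) AS running
    FROM orders
""").fetchall()
print(rows)
```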

Environment: SQL Server 2008, Oracle, Informatica, MS Office, Toad, SQL Server Management Studio, C#, Java, PL/SQL, T-SQL, UNIX
