
BI & Big Data Developer Resume


Seattle, WA

SUMMARY

  • Over 8 years of experience in the IT industry with expertise in Business Intelligence and Data Warehousing, including over 3 years of experience in Big Data environments
  • Involved in the analysis, design, architecture and development of software applications in different environments. Strong hands-on experience in developing and implementing BI/Data Warehousing projects using OBIEE & Informatica.
  • Good understanding and knowledge on different stages of the Software Development Life Cycle (SDLC).
  • Strong exposure to SQL & RDBMS: Oracle, MS SQL Server.
  • Extensively worked on ETL implementations using Informatica Power Center.
  • Strong Data Modeling skills using Toad and Erwin.
  • Extensive Business Intelligence experience using Oracle BI Applications 7.9x, OBIEE 11g, Informatica and DAC.
  • Worked Extensively on OBIEE Administration Tool, Presentation Services, Answers, Interactive Dashboards and Security Implementation.
  • Extensive experience in building Interactive Dashboards with drill-down & drill-across capabilities using global & local Filters, Guided Analytics, Business Rules with User Identity & Roles, Profiles & Personalization.
  • Very good in Debugging, Optimizing and Performance Tuning of Oracle BI Analytical Dashboards / Reports.
  • Expertise in providing end-to-end business intelligence solutions by configuring metadata & building the Analytics Repository, dimensional modeling design, building business models, generating reports, and creating Interactive Business Intelligence Dashboards using Oracle Business Intelligence.
  • Worked extensively in creating Users, Groups and Initialization Blocks in the Oracle Admin Tool to set up security using LDAP and VPDs.
  • Experience in Hadoop Ecosystem - HDFS, Map Reduce, YARN, TEZ, Hive, Pig and Sqoop and Big Data Concepts.
  • Hands-on experience in MapReduce programming.
  • Good understanding / knowledge of Hadoop and HDFS architecture and their components, such as Job Tracker, Task Tracker, Resource Manager, Node Manager, Application Master, NameNode, DataNode and YARN.
  • Performed data analysis using Hive and Pig.
  • Experience in Hive query performance tuning using various optimization techniques.
  • Worked on streaming data using Apache Flume and migrating the data from Oracle to Hadoop HDFS using Sqoop.
  • Worked in various verticals like Finance, Supply chain, Pharmaceuticals and Media industries.
  • Good team player and individual contributor with a quick learning curve and good interpersonal skills.

TECHNICAL SKILLS

Business Intelligence Tools: OBIEE 10.x/11g, OBI Apps 7.9.6, Cognos 8.0 Suite/Report Net.

ETL Tools: Informatica Power Center 8.5/9.1/9.5

Big Data Ecosystems: Hadoop, MapReduce (frameworks 1 & 2), Pig, Hive, HBase, Oozie, Sqoop, Flume

Operating Systems: Unix, Linux, Windows Server 2003/2008

Databases: Oracle 9i, 10g, SQL Server 2000/2005, DB2 8.x, Teradata, MongoDB, HBase

Languages: SQL, PL/SQL, T-SQL, C, C++, Java, J2EE

Web Techniques: HTML, XML, CSS, XSLT

PROFESSIONAL EXPERIENCE

Confidential, Seattle, WA

BI & BigData Developer

Responsibilities:

  • Used Hadoop architecture with Map Reduce functionality and its ecosystem to solve the customer requirements using Cloudera Distribution for Hadoop (CDH3).
  • Analyzing the functional specs provided by the client and developing detailed solution design document with the Architect and the team.
  • Discussing with the client business teams to confirm the solution design and changing the requirements if needed.
  • Streaming the data from their switches using Flume.
  • Parsing the data using Map Reduce functions written in Java as per the solution design and processing the information per subscriber.
  • Used Pig Latin scripts to further process the data and stored it in HBase using the Java API.
  • Writing customized User Defined Functions (UDFs) in Java and Perl to ease further processing in Pig.
  • Moving the data from the NoSQL database HBase into Hive.
  • Migrating the data to Oracle from Hive using Sqoop.
  • Creating partitions and buckets (clusters) in Hive tables.
  • Writing the recurring workflows using Oozie to automate the scheduling flow.
  • Addressing the issues occurring due to the huge volume of data and transitions.
  • Migration of database objects from previous versions to the latest releases using Oracle Data Pump when the solution was upgraded.
  • Developed Spark scripts by using Scala Shell commands as per the requirement.
  • Developed and implemented core API services using Scala and Spark.
  • Writing the design and technical documentation of the solution for the client.
  • Developing the solution on the Fusion Works platform for the change requests and for fixing the solution production issues.
  • Populated HDFS with huge amounts of data using Apache Kafka.
  • Used the OBIEE 11g Administration tool to develop and enhance meta-data.
  • Designed, developed and tested 100+ Informatica mappings/sessions/workflows to move data from source to staging to Analytics warehouse.
  • Created connection pools, physical tables, defined joins and implemented authorizations in the physical layer of the repository.
  • Configured and scheduled DAC tasks as a part of loading the ORAEIMD (DEV), ORAEIMU (UAT) and ORAEIMP (PROD) Data warehouse.
  • Configured and Created repository using OBIEE Admin tool.
  • Worked extensively with Physical Layer, Business Layer and Presentation Layer.
  • Created numerous executive-level reports and dashboards to provide insight into sales, marketing and financial data.
  • Designed Schema/Diagrams using Fact, Dimensions, Physical, Logical, Alias and Extension tables.
  • Created different types of Customized Reports (Drilldown, Aggregation) to meet client requirements.
  • Built Business model and established relationships & Foreign Keys (Physical & Logical) between tables.
  • Created pre-built reports with BI Publisher.
  • Created business reports using BI Answers as per requirements.
  • Generated various Analytics Reports like Formulary Report and Direct Sales Report using global and local filters.
  • Involved in modifications of physical, business and presentation layers of metadata repository using Siebel Analytics Administration Tool.
  • Developed custom reports/ad-hoc queries using Siebel Answers and assigned them to application-specific dashboards.
  • Developed different kinds of Reports (pivots, charts, tabular) using global and local Filters.
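
The Hive partitioning/clustering and Sqoop hand-off described above can be sketched as follows. This is a minimal illustration, not the project's actual code: the table, host, schema and credential names are all hypothetical, and the syntax targets the Hive 0.7.x / Sqoop 1 versions listed in the environment.

```shell
# Sketch: create a partitioned, bucketed (clustered) Hive table for the
# per-subscriber data, then push processed results back to Oracle.
# All names (subscriber_usage, dbhost, ORCL, etl_user) are hypothetical.

hive -e "
CREATE TABLE IF NOT EXISTS subscriber_usage (
  subscriber_id STRING,
  bytes_up      BIGINT,
  bytes_down    BIGINT
)
PARTITIONED BY (load_date STRING)
CLUSTERED BY (subscriber_id) INTO 32 BUCKETS
STORED AS RCFILE;"

# Export one day's partition from the Hive warehouse directory into Oracle
# (-P prompts for the password rather than placing it on the command line).
sqoop export \
  --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
  --username etl_user -P \
  --table SUBSCRIBER_USAGE \
  --export-dir /user/hive/warehouse/subscriber_usage/load_date=2014-01-01 \
  --input-fields-terminated-by '\001'
```

Bucketing on the subscriber key keeps each subscriber's rows together, which suits the per-subscriber processing these bullets describe.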

Environment: OBIEE 11g, OBI Apps 7.9.6, UNIX, Oracle Database, TOAD, Informatica 9.1, DAC, OBI Publisher, CDH3&4, Pig (0.8.1), Hive (0.7.1), Sqoop (v1), Oozie (v2.3.2), HBase, MapReduce, YARN, Apache Kafka.

Confidential, NH

Sr. BI Developer and BigData Developer

Responsibilities:

  • Developed data pipeline using Sqoop, Pig and Java map reduce to ingest customer behavioral data and financial histories into HDFS for analysis.
  • Worked on importing and exporting data from Oracle into HDFS and HIVE using Sqoop for analysis, visualization and to generate reports.
  • Developed Hive UDF to parse the staged raw data to get the Hit Times of the claims from a specific branch for a particular insurance type code.
  • Scheduled these jobs with the Oozie workflow engine.
  • Built wrapper shell scripts to hold these Oozie workflows.
  • Provided ad-hoc queries and data metrics to the Business Users using Hive, Pig.
  • Used Pig as ETL tool to do transformations, event joins and some pre-aggregations before storing the data onto HDFS.
  • Used Hive to analyze the partitioned and bucketed data and compute various metrics for reporting.
  • Used Hive and created Hive tables and involved in data loading and writing Hive UDFs.
  • Enabled concurrent access to Hive tables with shared and exclusive locking, using the ZooKeeper implementation in the cluster.
  • Identified the granularity level of the data required to be available for analysis.
  • Developed the Metadata repository (.rpd) and configured the Business Model Layer, Physical Data Model and Web Catalog objects as per the reporting requirements using the OBIEE Admin and Web tools.
  • Worked on Oracle BI Answers to build Interactive Dashboards with drill-down capabilities.
  • Mapped the configurations to accommodate the territorial hierarchy, which was exposed via chart drilldowns.
  • Assisted in creating pivot table requests that let managers analyze the business from different perspectives, and provided optimized Analytics requests using column selectors.
  • Implemented dynamic dashboard prompts to zoom into particular segments of the business in a performance-optimized manner.
  • Administered security and created alerts.
  • Created templates for presenting results and Analytics and modified the OBIEE Dashboard using cascading style sheets.
  • Involved in Reconciliation Process while testing loaded data with user reports.
  • Developed different kinds of Reports (pivots, charts, tabular) using global and local Filters.
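
The ingest and scheduling steps above can be sketched as a Sqoop import plus the kind of wrapper that submits an Oozie workflow. A hedged illustration only: the connection string, table names and Oozie endpoint are hypothetical, not taken from the project.

```shell
# Sketch: import claims data from Oracle straight into a Hive staging table.
# Names (CLAIMS, CLAIM_EVENTS, etl_user, dbhost) are hypothetical.
sqoop import \
  --connect jdbc:oracle:thin:@//dbhost:1521/CLAIMS \
  --username etl_user -P \
  --table CLAIM_EVENTS \
  --hive-import \
  --hive-table claim_events_staging \
  --num-mappers 8

# Wrapper-script pattern for holding the recurring Oozie workflow:
# submits the workflow defined in a job.properties file.
oozie job -oozie http://oozie-host:11000/oozie \
  -config /home/etl/claims/job.properties -run
```

Keeping the Oozie submission in a wrapper script lets the same properties file drive dev and production runs.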

Environment: Hadoop, MapReduce, Hive, HDFS, Pig, Sqoop, Oozie, Cloudera CDH3, HBase, ZooKeeper, MongoDB, OBIEE 11g, Netezza, Oracle 11g, Oracle SQL Developer, DataStage, Linux.

Confidential, Schaumburg, IL

Sr. BI/ETL Developer

Responsibilities:

  • Interacted with Client Services Management business representatives for gathering the Reports / Dashboards requirements and to define business and functional specifications.
  • Identified the gaps by analyzing the CS Data mart instance, provided alternative solutions and recommended changes to the database.
  • Design and development of repository (Creating / Modifying objects in Physical, Business and Presentation layer).
  • Created Dashboards by embedding reports and providing intuitive drilldowns and links to exploit the full benefit of Analytics.
  • Security management for users, groups and web-groups using the various authentication systems such as LDAP, OS, Database and Database table authentication using Session Variable features, as well as Dashboard / Report-level security.
  • Implemented criteria filters to narrow down the result set from analytics requests.
  • Developed/Bench-Marked performance tuning strategies with business model-level calculations versus Data warehouse level pre-computations and aggregations.
  • Performed Unit testing of various Dashboards / Reports.
  • Involved with ETL and SQL in updating database tables and getting in new customized tables when necessary.

Environment: Oracle Business Intelligence (OBI) EE 10.x, Informatica 7.x, Oracle 9i/10g, Oracle BI Publisher, Toad, PuTTY, Linux.

Confidential, NJ

BI Developer

Responsibilities:

  • Performed requirements analysis; designed and implemented custom business intelligence solutions (Metadata and Dashboards/Reports) using the Siebel / Oracle Business Intelligence Enterprise Edition (OBIEE) platform.
  • Performed the detailed design of Reports / Dashboards, Analytical Data Model (Metadata / .rpd) and performed Fit/Gap Analysis between existing Sales, Call center reports and new requirements.
  • Developed the repository model for the different work streams with the necessary logic, creating the Physical, BMM and Presentation layers; customized the existing model and built new models for what the OTB models did not cover. Created hierarchies, variables, and event polling strategies in the repository.
  • Developed and customized reports and new Reports, cosmetic changes to existing Dashboards with different subject matter Views (Drill-Down / Dynamic, Pivot Table, Chart, Column Selector, Tabular with global and local Filters) using Siebel Analytics Web / Oracle Business Intelligence (OBI) Presentation Services for the various work streams and sales force.
  • Enhanced performance of Reports / Dashboards by implementing aggregate tables and materialized views, tuning table execution paths with hints, creating / rebuilding indexes, and managing cache.
  • Implemented Security by setting up External Table Authentication, creating and integrating Session Variables, and Initialization Blocks.
  • Performed upgrades from Siebel Analytics 7.x to OBIEE 10.1.x.
  • Configured and automated iBots to deliver Analytics content.
  • Established benchmarks and SLAs for the upgrade process and for the runtime of OTB and new reports during the migration from Siebel Analytics to Oracle BI.
  • Customized OTB mappings by incorporating new extended columns in the Siebel CRM to the Siebel DWH.
  • Prepared the Test Plans and Test cases for validating the reports/dashboards.
  • Written many SQL statements for validating the reports and used the same in Test documents.
  • Created new mapping for extension and bridge tables that were not supported in OTB.
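
Report-validation SQL of the kind mentioned above is typically a re-aggregation of the warehouse fact table whose totals are compared against the dashboard output. A hypothetical sketch (the fact table, columns and connect string are illustrative, not from the project):

```shell
# Sketch: recompute a report total straight from the warehouse so it can be
# compared with the figure shown on the dashboard. All names are hypothetical.
sqlplus -s test_user@DWH <<'SQL'
SELECT period_key,
       SUM(revenue) AS warehouse_total
FROM   w_sales_f
GROUP  BY period_key
ORDER  BY period_key;
SQL
```

A mismatch between the query output and the dashboard figure points at either the ETL load or the repository's aggregation rules.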

Environment: OBIEE 10.x, Informatica, PowerCenter 7.x, Oracle10g, Linux, TOAD.

Confidential, Plano, TX

BI Developer

Responsibilities:

  • Coordinated with business users to understand the business rules, and worked with the data warehousing team and the front-end application development team.
  • Documented the Requirements and implemented a Design Solution for the BI Architecture.
  • Involved in the Installation and Configuration of Siebel Analytics Suite.
  • Worked closely with the ETL and Database Team to coordinate the Necessary Data movement.
  • Worked extensively on the Siebel Admin Tool to create the Physical, Business and Presentation Layers.
  • Designed Schema/Diagrams using Fact, Dimensions, Physical, Logical, Alias and Extension tables
  • Created various Analytic Reports using Global and Local Filters.
  • Created several Business Reports and Dashboards for the High level Management.
  • Modified the existing Business Layer to add more Dimensions in the Siebel Admin Tool.
  • Created new Calculated Measures and set appropriate Aggregation properties for the Dimension members.
  • Involved in managing appropriate security privileges on subject areas and dashboards according to business requirements.
  • Developed custom reports/ad-hoc queries using Siebel Answers and assigned them to application-specific dashboards.
  • Developed Interactive Dashboards with Prompts and added Forecast Measures which generate forecast values.
  • Created iBots on certain fields to alert high-level management users.
  • Created Groups and added Users in accordance with the existing User Privileges.

Environment: Siebel Analytics 7.5, Informatica 7.1, SQL Server 2000, Linux.

Confidential, St. Louis, MO

BI Developer

Responsibilities:

  • Gathered requirements from the user group and translated the requirements into detail technical and functional specifications.
  • Programmed PL/SQL code to implement business rules through triggers, procedures, functions, and packages using SQL*Plus and other editors.
  • Developed weekly, monthly and periodic stock status reports using Discoverer.
  • Involved in performance tuning and monitoring; tuned SQL queries using Code Statistics and Explain Plan features, analyzing the number of hits to find an optimal execution plan.
  • Wrote Ad-Hoc SQL queries for data analysis.
  • Developed new screens and reports using Oracle Forms/Reports 6i and PL/SQL.
  • Extensively involved in re-formatting and re-programming the Oracle Forms/Reports to correspond to updated client requirements.
  • Prepared and scheduled reports for printing and sent reports via email.
  • Extensively involved in coding of the triggers, and making use of built in packages and libraries while developing Forms.
  • Created application mode End User Layer, business areas, and discoverer workbooks.
  • Used TOAD to run SQL and PL/SQL code to manipulate the databases in all environments.
  • Generated DDL Queries for creation of new database objects like tables, views, sequences, functions, synonyms, indexes, triggers, packages, stored procedures, roles and granting privileges.
  • Involved in testing the Data Integrity of the new tables.
  • Created UNIX Shell scripts for automating the execution process.
  • Worked closely with application developer to register, edit forms and reports for production.

Environment: Oracle 9i, SQL, PL/SQL, SQL Server, XSLT, XPath, Java, JSP, WebLogic, Shell Scripts, crontab, TOAD, CVS.
