Lead Consultant Resume
New York City, NY
TECHNICAL SKILLS:
BigData/Hadoop Ecosystem: HDFS, YARN, Map Reduce, HIVE, PIG, HBase, Sqoop, ZooKeeper
Middleware: TIBCO ActiveMatrix Business Works (5.x), Designer (5.x), Patterns (4.x), Business Events (3.x, 5.x)
Middleware Tools: TIBCO Administrator (5.x), Hawk (4.x), Active Database adapter (5.x), File adapter (5.x), IBM MQ (6.x), iProcess Workspace (Browser) (11.x)
EAI Messaging: Enterprise Messaging Service (6.x), Rendezvous (8.x), UDDI, WSDL, SOAP, WSIL, MOM, MTOM
Programming Languages: C, C++, Java, C# .NET, ADO.NET, ASP.NET, Python
Databases: Oracle 8i/9i/10g/11g, MS SQL Server 05/08, PL/SQL
Directory Service: LDAP, JNDI, MS Active Directory
Case Tools: Visio 10, iWork 09
Version Control System: Subversion (Tortoise SVN), Visual Source Safe (VSS), CVS, ClearCase, Perforce, PVCS, XML Canon, GitHub, BitBucket
Web Server: Apache Tomcat
Web language: XML schema, DTD, XSLT, XPATH, PHP, HTML, JSP
Testing Tool: SoapUI Pro 4.0.1, CA (ITKO) Lisa Virtualize
Operating Systems: Windows 2000/NT/XP/7, Linux (Fedora, Redhat), Mac OS (10.x)
PROFESSIONAL EXPERIENCE:
Confidential, New York City, NY
Lead Consultant
Responsibilities:
- Leading and creating synergies within a multi-faceted team for development of the NBA system
- Worked extensively on Hive and Impala queries and their optimization
- Wrote Spark/Scala code for in-memory computation of scores/ranks based on the Random Forest algorithm
- Coordinated with the scheduling team for job/batch scheduling using Autosys and IBM’s Tivoli Workload Scheduler
- Worked with the relational database team to identify and load relevant tables into HDFS
- Engaged with the Hadoop COE team for infrastructure setup of the application, cluster, logging, and optimization tasks
- Wrote shell scripts for copying data into HDFS, running Hive and Impala queries, and submitting Spark jobs
Environment: Hive, Impala, Spark, Scala, Autosys, TWS, JIRA
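The shell-script workflow described above (HDFS copy, Hive/Impala queries, Spark submit) could be sketched as a small Python driver; all paths, query text, class names, and the `dry_run` switch below are hypothetical illustrations, and the actual scripts were plain shell:

```python
import subprocess

def build_hdfs_put(local_path, hdfs_dir):
    """Command to copy a local file into HDFS (mirrors `hadoop fs -put`)."""
    return ["hadoop", "fs", "-put", "-f", local_path, hdfs_dir]

def build_hive_query(sql):
    """Command to run an ad-hoc Hive query from the shell."""
    return ["hive", "-e", sql]

def build_spark_submit(jar, main_class):
    """Command to submit a Spark job to YARN."""
    return ["spark-submit", "--master", "yarn", "--class", main_class, jar]

def run(cmd, dry_run=True):
    """Print the command; execute only when a real cluster is available."""
    print(" ".join(cmd))
    if not dry_run:
        subprocess.check_call(cmd)

if __name__ == "__main__":
    # Hypothetical file names, query, and job, for illustration only.
    run(build_hdfs_put("scores.csv", "/data/nba/raw"))
    run(build_hive_query("SELECT COUNT(*) FROM nba.scores"))
    run(build_spark_submit("ranker.jar", "com.example.RankJob"))
```

Keeping the command assembly separate from execution makes each step easy to log and to dry-run before touching the cluster.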
Confidential, Jersey City, NJ
Lead Consultant
Responsibilities:
- Ensured the Hadoop cluster was set up and running by configuring all required parameters using Cloudera Distribution Hadoop (CDH 5)
- Loaded data into Apache Hive and Impala from heterogeneous databases (DB2, Oracle, and SQL Server) using Apache Sqoop
- Managed data and tables using Hue
- Visualized data using d3.js, a JavaScript library
- Developed a password-encryption API for use in Sqoop commands to avoid exposing plaintext passwords
- Developed a Java MapReduce regular-expression program to search free-text data loaded from the DB2 application into Hive/Impala
- Developed a Java regex wrapper to extract required fields from free-text data
- Wrote Hive scripts to create final transformed tables in the Impala database that are exposed to the d3.js front end using Cloudera’s ODBC driver
- Performed tuning on Hive queries
- Created an incremental data strategy framework to perform a daily refresh of the data in Hive/Impala from the DB2, Oracle, and SQL Server databases
- Developed HBase tables using HBase storage handlers in Hive; this was done to effectively utilize the transactional versioning capability in HBase. These HBase tables are exposed to the d3.js front end
- Wrote Hive queries using advanced aggregate functions such as collect_set to transform rows into columns and load them into HBase tables
- Wrote an Oozie workflow that runs daily to perform the data load and create the final transformed tables
- Currently working on an ETL-based application that extracts data from various inbound feeds, stores it in Oracle, and generates alerts using the Actimize toolset
- Worked on data development/modeling, integration, SQL, and ETL for the BDS initiative of the Regulation W track of the trade & surveillance unit
Environment: Sqoop, Hive, JavaScript and d3.js
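The regex wrapper for pulling required fields out of free-text records was written in Java; a Python sketch of the same idea follows. The field names and patterns are hypothetical examples, not the actual production rules:

```python
import re

# Hypothetical field patterns; the real wrapper's rules were site-specific.
FIELD_PATTERNS = {
    "account_id": re.compile(r"\bACCT[-\s]?(\d{6,10})\b"),
    "trade_date": re.compile(r"\b(\d{4}-\d{2}-\d{2})\b"),
    "amount":     re.compile(r"\$([\d,]+\.\d{2})"),
}

def extract_fields(free_text):
    """Pull the first match for each known field out of a free-text record."""
    out = {}
    for name, pattern in FIELD_PATTERNS.items():
        m = pattern.search(free_text)
        if m:
            out[name] = m.group(1)
    return out

if __name__ == "__main__":
    record = "Wire ACCT-00123456 settled 2014-03-07 for $1,250.00"
    print(extract_fields(record))
```

Centralizing the patterns in one table keeps adding a new field to a one-line change, which matters when the upstream free text evolves.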
Confidential, Atlanta, GA
IT Consultant
Responsibilities:
- Engineered, developed and maintained complex interfaces and code base using Business Events and Business Works
- Worked extensively on varied BE Concepts, Rules, Rule-functions, engine performance tuning, EMS destinations and servers
- Worked on BW COBOL Copybook Plug-in, web-services, MQ series connection, iAPI based logging and exception handling, and automated deployments
- Convinced and led the team to use Hadoop-based solutions to cater to the large amount of data being received and its processing needs; designed, developed, and deployed big-data processing applications capable of handling 25 TB of data daily on a 50-node Cloudera cluster
- Used Hive to refine data for further analysis and to transform files from different analytical formats to text formats; wrote Pig Latin scripts for data processing
- Performed data migration from the RDBMS to HDFS using Sqoop; analyzed data using HQL, Pig, and MapReduce and put it back in the RDBMS
- Coordinated with various teams on technical requirements and implementations; subsequently tested and deployed multiple applications across environments
- Led and assisted client teams in debugging and troubleshooting current applications
- Created and maintained technical documentation for launching Hadoop jobs and for executing Hive/Pig queries and scripts
Environment: HADOOP 1.2.1, PIG 0.13.0, HIVE 0.13.1, SQOOP 1.4.6, HBASE 1.0.1, CDH 5.x, TIBCO BE 5.1.2, BW 5.11, TRA 5.7, EMS 7.0, Admin 5.7, Gems 3.4, Oracle 11g, GoldenGate, Linux, Sun Solaris
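An RDBMS-to-HDFS migration of the kind described above is typically driven by `sqoop import`; the sketch below assembles such a command line in Python. The JDBC URL, table, directory, and credential alias are hypothetical, and the real jobs' connection details were site-specific:

```python
def build_sqoop_import(jdbc_url, table, target_dir, username, password_alias):
    """Assemble a `sqoop import` command for an RDBMS-to-HDFS copy."""
    return [
        "sqoop", "import",
        "--connect", jdbc_url,
        "--table", table,
        "--target-dir", target_dir,
        "--username", username,
        # Alias into the Hadoop credential store, so no plaintext password
        # appears on the command line or in shell history.
        "--password-alias", password_alias,
        "--num-mappers", "4",
    ]

if __name__ == "__main__":
    # Hypothetical connection details, for illustration only.
    cmd = build_sqoop_import(
        "jdbc:oracle:thin:@dbhost:1521/ORCL",
        "AUCTIONS", "/data/raw/auctions", "etl_user", "oracle.pw")
    print(" ".join(cmd))
```

Using `--password-alias` rather than `--password` keeps credentials out of process listings, the same concern the password-encryption API above addressed.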
Confidential, Atlanta, GA
IT Consultant
Responsibilities:
- Gathered requirements, worked with BAs, and developed new functionality using Hadoop for buyer- and seller-side data
- Coordinated with the database, AS/400, Informatica, deployment, and testing teams for successful implementation of the project
- Responsible for adding and removing cluster nodes, cluster monitoring and troubleshooting, managing and reviewing data backups, performance tuning, and managing and reviewing Hadoop log files
- Handled importing of structured and semi-structured data from various data sources; performed transformations using Hive and MapReduce, loaded data into HDFS, and exported data to the RDBMS and onto the ESB using Avro and HP Vertica
- Published data from files and databases using EMS messaging and BW orchestration capabilities
- Troubleshot and resolved issues faced by internal teams pertaining to individual auction records
- Researched usability of a REST service using HBase and the ActiveMatrix BW plug-in for REST and JSON per our requirements
Environment: HADOOP 1.2.1, PIG 0.13.0, HIVE 0.13.1, SQOOP 1.4.6, AVRO 1.7.6, CDH 4.x, TIBCO BW 5.10, EMS 7.0, Admin 5.7, Oracle 11g, Informatica, Jenkins, AS/400
Confidential, San Diego, CA
Hadoop Consultant
Responsibilities:
- Worked with technology and business groups at Snaptracs to recommend a suitable Hadoop technology stack for the migration strategy
- Assessed currently planned phases for the Hadoop/Big Data implementation, the current architecture, roadmap, and technology fit, considering data growth
- Performed infrastructure setup, capacity planning, and administration of a 38-node Hortonworks HDP 1.3 data cluster
- Troubleshot and addressed challenges of the production cutover; provided production support
Environment: HADOOP 1.2.1, PIG 0.11.1, HIVE 0.13.1, JAVA 6, Oracle 11g, Atlassian, Splunk
Confidential, Las Vegas, NV
HADOOP Consultant
Responsibilities:
- Analyzed and implemented best practices on existing code, along with logging and exception handling, with the aim of optimizing operational efficiency for data capture
- Researched various Hadoop technology stacks for data access, data storage, data serialization, data intelligence, data integration, management, monitoring, and orchestration
- Processed streaming inputs and imported data into HDFS from heterogeneous technologies
- Performed setup and capacity planning of a CDH 4.3 Hadoop cluster
- Helped release management with installations, configurations, and deployments of the improved and enhanced solutions
- Documented detailed operational guides explaining designs, interfaces, process flows, methodologies, architecture diagrams, and deployment activities
Environment: CDH 4.3, HADOOP 0.23.6, PIG 0.11.0, HIVE 0.10.0, JAVA 6, CDH 4.x, TIBCO BW 5.9, EMS 6.1, Admin 5.7
Confidential, Richfield, MN
TIBCO Lead
Responsibilities:
- Created synergies within cross-cultural, cross-location, and cross-time-zone teams (positioned in China & India)
- Coordinated and mediated with the client and offshore team on issues and timely deliverables
- Worked with the front-end, back-end, security, infrastructure, and TIBCO PSG teams on design and engineering
- Presided over and assessed technical implementation of schema designs, BW & BE implementations, CLE alerts, Hawk rule constructs, security reviews, and deployments
- Created web services using SOAP over HTTPS to expose data organized in Oracle DB; created a web-service client to retrieve and subsequently expose data from the Siebel database
- Created simple events and rules in Business Events to validate incoming requests from client application
- Consulted the TIBCO Professional Services Group (PSG), putting TIBCO COE standards and best practices into effect for PROD deployments
- Learned CA (ITKO) Lisa service virtualization (Virtual Service Environment/Image tools), testing, and automation during rigorous sessions by CA Technologies
- Researched Chef and Vagrant as part of an automation project that would enable continuous integration and equip the IT division with on-demand TIBCO virtual clients
Environment: TIBCO BW 5.9, BE 5.0, TRA 5.7, EMS 6.0, Oracle 10g, MakeDoc for TIBCO, CA (ITKO) Lisa Virtualize, Jenkins, GitHub, Test Maker (PushToTest), HP Systinet, Layer7, MS InfoPath 10, Rally Software
Confidential, Menomonee Falls, WI
TIBCO Lead
Responsibilities:
- Coordinated and guided technical design sessions with Kohl’s and Responsys
- Configured JMS connections to connect with and retrieve messages through IBM WebSphere MQ MOM
- Integrated certificates & keys to establish connections with an external web service while encrypting & decrypting messages over a secure network
- Created a web-service client to integrate with an external web service using SOAP over HTTPS
- Implemented session management, message prioritization, logging, and a tracking framework to handle messages
- Worked on performance testing, load balancing, and stress-testing principles of deployment to handle the upcoming holiday season transactions
Environment: TIBCO BW 5.6, TRA 5.6, MQ Series 7.0, Admin 5.6, SoapUI 4.5
Confidential, Centennial, CO
TIBCO Developer
Responsibilities:
- Engineered a broad spectrum of TIBCO BW processes, synchronizing with EMS, Business Events, General Interface forms, iProcess integrator, and MS SQL Server
- Designed, configured and monitored EMS messaging server and tools
- Designed and implemented logging & exception handling using TIBCO CLE
- Expanded JMS based interface to TIBCO iProcess Workspace (Browser)
- Created web-services using WSDL and SOAP call to interact wif data analysts using TIBCO GI
- Performed QA activities in staging environment
- Monitored and administered technical and business processes using TIBCO Administrator and Hawk tools to provide production support
- Implemented a matching engine to de-duplicate data using TIBCO Patterns
- Generated unit tests for each operation using SoapUI Pro
- Wrote several rule sets, rules, events and concepts to validate input files while interfacing between BE and BW
- Migrated and deployed the project from TIBCO Business Events 3.0 to 5.0
Environment: TIBCO BW 5.8, TRA 5.6, EMS 5.1, Admin 5.6, BE 3.0, 5.0, GI 3.9, Business Studio (iProcess) 3.4, iProcess workspace browser 11.3, Patterns 4.5, MS SQL 08, SoapUI Pro 4.0, Rally Software
Confidential
Software Engineer
Responsibilities:
- Tata Consultancy Services Limited, one of India's most valuable companies, provides information technology (IT) services, business solutions, and outsourcing services worldwide
- Completed a rigorous project under the Travel, Transportation and Hospitality division: designed and implemented shipping- and airline-industry software operations (ticketing, logistics, management) on .NET and subsequently with TIBCO tools; strategized with the team to clinch a letter of appreciation from the Delivery Head, TCS North, for best and timely execution
