Programmer/Analyst Big Data Resume
Dallas, TX
SUMMARY
- 12+ years of professional IT experience in programming, design, and application support of enterprise applications on Unix and Windows platforms, including experience in the Big Data ecosystem.
- Good working experience in the telecommunications, banking, and financial electronic payment solutions domains.
- Worked on C/C++, Java, Unix, SQL, NoSQL, and shell scripting based applications; well versed in RDBMS such as MS SQL Server, Oracle, and MariaDB, NoSQL stores such as MongoDB, HBase, and Cassandra, and OS concepts.
- Good experience with Big Data ecosystem components such as Hadoop, MapReduce, HDFS, HBase, Hive, Sqoop, Pig, Zookeeper, and Flume.
- Good knowledge of Hadoop cluster architecture and the Hortonworks and Cloudera distributions.
- Basic understanding of and exposure to Spark, streaming, cloud, and DevOps technologies.
- Experience working with application monitoring and job scheduling tools such as Introscope, Control-M, HFrunmonitor, and AutoSys.
- Involved in all phases of the Software Development Life Cycle, including design, data modeling, software implementation, testing, support, and maintenance.
- Worked on projects using configuration management tools such as PBN (Project by Net), StarTeam, Git, and SVN, and ticket tracking systems such as Jira, Bugzilla, and AOTS BMC Remedy for application support; used PuTTY for server access.
- Experience in user/customer interactions, requirements gathering from business users, and the design and development of banking and credit applications and products.
- An effective communicator with excellent client relationship and management skills.
- Have experience working with the Scrum/Agile methodology on projects.
- Have worked in an onsite-offshore model.
- Experience working in ITIL processes: Incident Management, Problem Management, Change Management, Event Management, SLA, and Release & Configuration Management.
TECHNICAL SKILLS
Operating Systems: Windows (95/98/XP), Unix, Linux
Scripts: Shell scripting, HTML, XML, JavaScript, Python
Languages: C, C++, STL, Java, J2EE, SQL, Scala
Databases: Oracle 9i, MS Access, MySQL, Titan, MongoDB, Cassandra
Middleware: MQ Series, Connect Direct, Kafka
Tools & Utilities: MS Visual Studio 6.0, gcc, g++, gdb, UML, CVS, Jira, Eclipse, StarTeam, BMC Remedy (AOTS-TM and CM), Toad, PuTTY, Wily Introscope, ALDB, Patrol, One Tool, Netcool, ESET, Apache Web Server, IBM WebSphere Application Server 6.0, Jenkins, Chef, Puppet
Web Related: XML, Web Services, Cloud (AIC), AWS
Big Data Ecosystem: HDFS, HBase, Hadoop MapReduce, ZooKeeper, Hive, Pig, Sqoop, Flume, Oozie, Cassandra, Spark
PROFESSIONAL EXPERIENCE
Confidential, Dallas, TX
Programmer/Analyst Big Data
Responsibilities:
- Refining metadata movement using shell scripts to support profiling of various databases.
- Responsible for DPLR GID profiling jobs against data sources within the Confidential &T family to identify monetization or rationalization targets.
- Maintained subscriptions/publications of different feeds and topics, and ran the extraction and publishing of files to the Data Router.
- Developing Scripts and Batch Jobs to schedule various Hadoop Programs.
- Created Hive tables and Hive queries for data analysis to meet business requirements.
- Involved in installing Hadoop Ecosystem components in UAT/Prod.
- Involved in monitoring the Hadoop clusters and running health checks in Ambari (Hortonworks).
- Involved in initial data load and delta load management using SQL scripts.
- Worked on shell scripts for profiling, collecting information about database schemas, tables, and columns.
- Responsible for data Ingestion to Hadoop system from various sources using Sqoop.
- Responsible for monitoring data loads in application using Introscope.
- Log analysis using Elasticsearch, Logstash, and Kibana; see the first sketch after this list.
- Part of the team transitioning existing complex infrastructure to the Confidential &T cloud.
- Assisted in gap analysis of data models and data extracted in ETLs, and mapped them to Hadoop HDFS.
- Worked on implementing Big Data pipelines using Sqoop, Flume, Java, and MongoDB.
- Created UDFs in Java to customize data in Pig.
- Worked on data analytics using Spark SQL (PySpark); see the second sketch after this list.
- Used Spark Streaming APIs to perform the necessary transformations and actions.
- Ran CQL queries, created tables, and loaded data into Cassandra; see the third sketch after this list.
- Involved in installing security certificates in QA, UAT and Prod servers.
- Providing support for User acceptance test and production implementation.
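First, a minimal sketch of the log analysis described above, using the Elasticsearch Python client; the index pattern (app-logs-*) and field names are hypothetical placeholders, not the project's actual indices.

```python
# A quick log-search sketch with the Elasticsearch Python client
# (pip install elasticsearch). The index pattern "app-logs-*" and the
# field names (level, @timestamp, message) are hypothetical placeholders.
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")  # adjust host per environment

# Pull the most recent ERROR entries, the kind of query Kibana runs
# behind its Discover view.
resp = es.search(
    index="app-logs-*",
    body={
        "query": {"match": {"level": "ERROR"}},
        "sort": [{"@timestamp": {"order": "desc"}}],
        "size": 10,
    },
)
for hit in resp["hits"]["hits"]:
    src = hit["_source"]
    print(src.get("@timestamp"), src.get("message"))
```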
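Second, a minimal PySpark sketch of the Spark SQL analysis over Hive tables; the table and column names (usage_events, device_id, bytes_used) are illustrative, not the actual project schema.

```python
# A minimal PySpark sketch: run Spark SQL over a Hive table.
# The table and columns (usage_events, device_id, bytes_used) are
# illustrative placeholders, not the actual project schema.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("data-analytics")
    .enableHiveSupport()  # lets spark.sql() see Hive metastore tables
    .getOrCreate()
)

# Aggregate usage per device and keep the heaviest consumers.
top_devices = spark.sql("""
    SELECT device_id, SUM(bytes_used) AS total_bytes
    FROM usage_events
    GROUP BY device_id
    ORDER BY total_bytes DESC
    LIMIT 20
""")
top_devices.show()
spark.stop()
```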
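Third, a minimal sketch of the Cassandra loading, using the DataStax Python driver; the keyspace, table, and feed name are hypothetical placeholders.

```python
# A minimal Cassandra sketch with the DataStax Python driver
# (pip install cassandra-driver). Keyspace, table, and feed name
# are hypothetical placeholders.
from cassandra.cluster import Cluster

cluster = Cluster(["127.0.0.1"])  # contact points; adjust per environment
session = cluster.connect()

session.execute("""
    CREATE KEYSPACE IF NOT EXISTS demo
    WITH replication = {'class': 'SimpleStrategy', 'replication_factor': 1}
""")
session.execute("""
    CREATE TABLE IF NOT EXISTS demo.feed_files (
        feed_name text,
        loaded_at timestamp,
        file_id   uuid,
        PRIMARY KEY (feed_name, loaded_at)
    )
""")

# Load one row; %s placeholders keep values out of the CQL string.
session.execute(
    "INSERT INTO demo.feed_files (feed_name, loaded_at, file_id) "
    "VALUES (%s, toTimestamp(now()), uuid())",
    ("sample_feed",),
)
cluster.shutdown()
```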
Environment: Hadoop, Hive, Sqoop, YARN, Pig, Unix, shell scripts, Spark, Python, Spark SQL, DataFrames, SQL, Java, JSON, NoSQL, Oracle, MariaDB, MongoDB, Cassandra, Hortonworks, Cloud, Scala, Kafka, Linux, Introscope, Kibana, Elasticsearch, Jenkins, Tomcat, JBoss, EMR, Titan, GraphDB, Groovy, HBase
Confidential, St. Louis, MO
Programmer/Analyst Big Data
Responsibilities:
- Reviewed, designed, and incorporated business changes.
- Analyzed business requests with the client and performed impact analysis.
- Migrated data from Oracle and MySQL into HDFS using Sqoop, and imported flat files in various formats into HDFS.
- Loaded logs and event data into Hadoop using Flume.
- Preprocessed data using Pig/Hive to filter data for client devices.
- Managed and reviewed Hadoop log files.
- Involved in alerting and automation of daemon processes through the EDART setup.
- Documented the as-is state of the environment, performed gap analysis, and came up with options and recommendations.
- Resolved user issues, fixed bugs, and maintained automated tooling code.
- Participated in server migration on the Confidential &T cloud.
- Involved in designing the AIC (Confidential &T Integrated Cloud) dashboard and coordinating with different stakeholders.
- Involved in upgrading MongoDB (NoSQL) to the latest version.
- Wrote Java functions and shell scripts to pull related data from different interface databases; see the sketch after this list.
- Kept all stakeholders informed of project status and reports.
- Participated in meetings with different business groups to identify solutions and risks.
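A minimal sketch of this kind of data pull, shown here with pymongo rather than the Java and shell scripts used in the project; the database, collection, and field names are hypothetical.

```python
# A minimal pymongo sketch of pulling interface records from MongoDB.
# Database, collection, and field names (opsdb, interface_events,
# status) are hypothetical placeholders.
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017/")
events = client["opsdb"]["interface_events"]

# Fetch the 50 most recent failed interface events for triage.
for doc in events.find({"status": "FAILED"}).sort("timestamp", -1).limit(50):
    print(doc["_id"], doc.get("interface"), doc.get("error_msg"))

client.close()
```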
Environment: Linux, Unix shell script, Perl, Java, JSON, REST, NoSQL, Python, Oracle, MongoDB, Git, SVN, Jira, Bugzilla, Introscope, HFrunmonitor, Splunk, Hibernate, Hadoop, Hive, Pig, HBase
Confidential, NJ
Technical Analyst
Responsibilities:
- User requirement Analysis, Development, Testing and Documentation
- Analyzing business requests with the client and performing impact analysis.
- Reviewing, designing, and incorporating business changes.
- Troubleshot and resolved application issues escalated by users.
- Worked on all major enhancements, bug fixes, and major releases.
- Responsible for implementing all assigned Change Requests.
- Participating in application recovery efforts for outages.
- Performed test deployments and patch installations on test, UAT, and production machines.
- Involved in Production Application support in Unix.
- Documented project details, technical specifications, and a troubleshooting guide.
- Responsible for handling meetings with different teams to identify new change requests and releases.
- Participated in meetings with different business groups to identify solutions and risks.
- Carried out deployments and sanity testing of the applications on every new release.
Environment: Linux, Unix shell script, Solaris, C/C++, Oracle, BMC Remedy, Introscope
Confidential, NJ
Programmer/Analyst
Responsibilities:
- User requirement Analysis, Design, Development, Testing and Documentation.
- Finalization of the Project Scope Document and Deliverables.
- Performed maintenance, upgrades, and troubleshooting for technical issues.
- Monitoring the overall progress and weekly reporting to Executive Steering Committee.
- Part of the team upgrading from JD Edwards World to JD Edwards EnterpriseOne 8.12/8.98.3, creating new applications, reports, and business functions, and handling production support and new development.
- Conducted weekly reviews and tracked monthly SLAs.
- Trained and supported IT staff on optimal techniques for JDE World software.
- Provided user training and support on the proper use of financial applications.
- Troubleshot and resolved application-related glitches.
- Documented program changes and process flows.
- Tracked financial reports and presented the data to financial teams.
Environment: JD Edwards, Citrix server, Linux, Unix shell script, Solaris, C/C++/Java, Oracle, BMC Remedy, Introscope
Confidential
Backend C++ Programmer
Responsibilities:
- Developed server-side C++ code implementing various commands.
- Used embedded SQL to connect to the NonStop SQL/MX database.
- Worked with Stored Procedures, Functions, Cursors and Triggers on Server Side.
- Used STL and multithreading in the application.
- Made extensive use of XML to transfer data between client and server.
- Helped in refactoring the client framework through encapsulation and inheritance.
- Developed client-side modules using OOP principles.
- Followed Agile methodology in software development.
- Wrote queries to get data from the database.
- Involved in testing and monitoring of queries.
- Wrote test cases in QTP for automation and manual testing.
- Involved in the Regression testing of the Application.
Environment: C++, STL, embedded SQL, C#, XML, HP NonStop SQL (SQL/MX), Visual Studio 5.0, Mercury Quality Center, HP-UX, Tandem
Confidential, KS
Programmer/Analyst
Responsibilities:
- Developed shell programs and configuration files in UNIX to handle the business logic.
- Involved in the enhancement of Dfile library functions using C/C++.
- Worked with Pro*C programs for handling subscriber information.
- Used sed/awk in various shell scripts.
- Worked with stored procedures, functions, cursors to get data and to handle the business logic.
- Involved in Processing and monitoring of various jobs in UNIX.
- Involved in testing and monitoring of queries for Teradata.
- Wrote queries to fetch data from the Oracle database; see the sketch after this list.
- Involved in the analysis of Data Mapping between the existing source system (P2K) and new source systems.
- Created Shell scripts for handling the data process in new systems to generate various reports.
- Involved in analysis, design, coding, and unit testing of the application.
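A minimal Python sketch of the Oracle queries described above, shown with cx_Oracle rather than the Pro*C used in the project; the connection string, table, and column names are hypothetical placeholders.

```python
# A minimal Python sketch of fetching subscriber rows from Oracle
# (pip install cx_Oracle); the original programs were written in Pro*C.
# Connection string, table, and column names are hypothetical.
import cx_Oracle

conn = cx_Oracle.connect("scott", "tiger", "dbhost:1521/ORCLPDB1")
try:
    cur = conn.cursor()
    # Bind variables (:st) avoid SQL injection and keep plans reusable.
    cur.execute(
        "SELECT subscriber_id, status FROM subscribers WHERE status = :st",
        st="ACTIVE",
    )
    for subscriber_id, status in cur.fetchmany(100):
        print(subscriber_id, status)
finally:
    conn.close()
```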
Environment: C/C++, Pro*C, SQL, Oracle 9i, Sun Solaris, AIX
Confidential
Backend Software Engineer
Responsibilities:
- Successfully implemented the direct-to-bank (D2B) facility to coordinate with national banks in the Philippines.
- Involved in implementing the automation of the adjustment request and reinstatement processes.
- Successfully implemented the OFAC enhancements using C++.
- Had excellent interactions with the client, business analysts, and the development and testing teams.
- Involved in design and development of the Enhancements requested by the client.
- Involved in fixing the tracker defects and bugs found in the application using C++/Java.
- Involved in unit testing and integration testing of the developed module.
- Wrote Unit test cases for testing.
- Used StarTeam as a configuration management tool for managing the module developments.
- Used Project management tool Jira for managing the project.
- Involved in maintaining the quality processes in the project.
Environment: C/C++, Java, COM, MS Visual Studio 6.0, StarTeam, Jira, Borland C++ Compiler 5.02, TDB (debugger)
Confidential
Software Engineer
Responsibilities:
- Successfully implemented the settlement process in a multithreaded application.
- Involved in application development using C++ to handle financial transactions.
- Developed libraries in C for interacting with the PIN pad connected to the machine.
- Involved in unit testing and integration testing of the project.
- Involved in writing the various test cases.
- Involved in writing the batch files to build the application.
Environment: C/C++, COM, MS Visual Studio 6.0, StarTeam, Jira