Name: NAMIT

EXECUTIVE SUMMARY

Over 10 years of experience in Big Data technologies such as Hadoop and MongoDB, together with Java, Python, Big Data analytics, system integration and consulting. SME in Data Science and Big Data Analytics, with expertise in a whole gamut of technologies ranging from Java, Python, data warehousing, ETL and reporting tools to the full Hadoop ecosystem, including Pig, Hive, Sqoop, Flume, Oozie, ZooKeeper etc.

EXPERTISE

Domain: Banking, Retail and Media Technology
Big Data: Hadoop ecosystem
Database: NoSQL (MongoDB, Cassandra); RDBMS (SQL Server)
Programming: Java, Python
ETL: Pervasive Data Integrator
Reporting: Tableau, SAP BusinessObjects
OS: Windows, Linux, Unix

DELIVERIES

Cisco -> MongoDB
HP -> Hadoop and MongoDB
SkillWise -> Hadoop
PeopleClick -> Hadoop
EY -> Hadoop and MongoDB

HIGHLIGHTS

- Cloudera certified Hadoop Developer.
- Certified MongoDB Developer and Certified MongoDB DBA.
- Has been training professionals in Big Data (Hadoop) and MongoDB for more than a year.
- Has followed developments in the Big Data arena since its entry into the market.
- Expertise in Big Data Analytics: Cloudera Hadoop and NoSQL databases.
- Experienced in building end-to-end solutions in Data Warehousing and Business Intelligence.
- Experienced in building end-to-end Java applications.
- Proficient in SQL programming and warehousing concepts.
- Experienced in proposing and implementing technology solutions in the BI space.
- Adept at dashboarding and reporting: data analysis, ETL, and creating visuals and reports from the data provided by the client.
- Good communication, problem-solving and interpersonal skills.

PROJECT DETAILS

No. 1
Project: Member Loyalty Management
Client: Walmart
Hadoop Framework: HDFS, MapReduce, Hive, HBase, Sqoop, Pig
Scripting: Python, UNIX
OS: Ubuntu 11.10
ETL Tool: Pentaho Data Integration

Maintaining customer member details and reward-points transactions is difficult in terms of storage and processing. The Member Loyalty Management system replaces the existing reward management system, which was developed as a web service backed by database sharding; the aim of this system is to reduce the response time of the web service. It is designed with an HBase storage handler, with plans to later move some BI report generation to Hive.

Technologies Used: Java 1.6, HDFS, MapReduce, Hive, HBase, Sqoop, Pig, Python, Eclipse, Ubuntu 11.10

Roles and Responsibilities:
- Set up Hadoop over multiple nodes; designed and developed Java MapReduce jobs.
- Set up Pig, Hive and HBase on multiple nodes and developed against them.
- Analyzed the functional specifications.
- Produced the basic design, decomposing the problem into many small MapReduce jobs.
- Developed MapReduce applications using the Hadoop MapReduce programming API and the HBase APIs.
- Stored and retrieved data using HQL in Hive.

No. 2
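The decomposition into many small MapReduce jobs described in Project No. 1 can be sketched in the Hadoop Streaming style with a mapper/reducer pair. This is a minimal illustration only; the tab-separated record layout, member IDs and point values are assumptions, not the actual Walmart schema:

```python
from itertools import groupby

# Illustrative reward-points aggregation in the Hadoop Streaming style.
# The tab-separated (member_id, points) record layout is an assumption.

def map_line(line):
    """Mapper step: emit a (member_id, points) pair from one transaction record."""
    member_id, points = line.rstrip("\n").split("\t")[:2]
    return member_id, int(points)

def reduce_pairs(sorted_pairs):
    """Reducer step: sum points per member. Input must be sorted by key,
    which Hadoop's shuffle/sort phase guarantees between map and reduce."""
    for member_id, group in groupby(sorted_pairs, key=lambda kv: kv[0]):
        yield member_id, sum(points for _, points in group)

if __name__ == "__main__":
    # Local stand-in for a real run, where map_line would consume stdin in the
    # -mapper script and reduce_pairs would run in the -reducer script.
    sample = ["M1001\t120\n", "M1002\t45\n", "M1001\t30\n"]
    shuffled = sorted(map_line(line) for line in sample)  # mimics shuffle/sort
    for member_id, total in reduce_pairs(shuffled):
        print("%s\t%d" % (member_id, total))  # M1001 -> 150, M1002 -> 45
```

A job of this shape would be submitted with the hadoop-streaming jar, passing the mapper and reducer scripts via the -mapper and -reducer options.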
Project: Prototype Track and Trace for Pharmaceuticals
Client: GHX
Hadoop Framework: HDFS, MapReduce, Hive, HBase, Sqoop, Pig, Thrift
Scripting: Python, UNIX
OS: Ubuntu 11.10

Architected and prototyped Track and Trace for pharmaceuticals using HDFS, HBase, Hive, XML, and REST with fine-grained, certificate-based access control, with a capacity of 1,000-10,000 transactions per second and background processes to verify chain of custody and prevent fraud.

Roles and Responsibilities:
- Designed Hadoop jobs to verify chain of custody and look for fraud indications.
- Prepared a multi-cluster test harness on EC2 to exercise the system for performance and failover.

No. 3
Project: Prototype for Wipro Healthcare Division: Patient Medical Record Analysis
Hadoop Framework: HDFS, MapReduce, Hive, HBase, Sqoop, Pig
Scripting: Python, UNIX
OS: Ubuntu 11.10

Data analysis is performed on patient medical record data sets, and the result data set is materialized into structures that can be rapidly accessed and analyzed via off-the-shelf tools or custom-built applications; i.e., the application writes data back into HBase for other processing needs. Tasks include:
- Transferring drug purchase transaction details from legacy systems to HDFS
- Identifying the target patients
- Analyzing region-based drug consumption

Roles and Responsibilities:
- Analyzed the functional specifications.
- Produced the basic design, decomposing the problem into many small MapReduce jobs.
- Developed MapReduce applications using the Hadoop MapReduce programming API and the HBase APIs.
- Stored and retrieved data using HQL in Hive.

Technologies Used: Java 1.6, HDFS, MapReduce, Hive, HBase, Pig, Python, OraOop, Eclipse, Ubuntu 11.10

No. 4
Project: Telecom Order Management
Client: British Telecom
Technology: Java, Apex, Visualforce
Integration Tools: CastIron, Informatica Cloud
TOM (Telecom Order Management) is a complete tool to manage service order fulfillment for telcos and ISPs, with features including:
- Quoting
- Order configuration
- Work order & workflow
- Task sequencing and escalation
- Premise & circuit tracking
- Provisioning management

Role: Technical Developer
Technology Used: Salesforce Configuration, Apex Programming, Visualforce, Data Loader, Web Services

Construction and Configuration:
- Construction of custom objects: Service Orders, Install Bases, Service Plans, Work Orders, Services, Equipment, Charges, Carriers, UsageType

No. 5
Project: Implementation of Sales Automation
Client: Novartis
Duration: Nov 2009 to Dec 2010
Technology: Java, Apex, Visualforce
Integration Tools: CastIron, Informatica Cloud

This project mainly involves integrating the company's legacy application with Salesforce Accounts, Contacts and Cases, including real-time integration of the client's database with Salesforce.

Role: Technical Developer
Technology Used: Salesforce.com Configuration, Validations, Workflows, Apex Programming, Visualforce, S-Control, Force.com Eclipse IDE, Apex Data Loader, Mobile Configuration

Responsibilities: Application configuration, data migration, programming in Apex script, Visualforce screen design and programming, preparation and execution of unit and system test cases.

Construction and Configuration: Configured and customized the following modules:
- Sales Force Automation
- Opportunity Management
- Salesforce.com Outlook Integration
- Analytics
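The real-time legacy-to-Salesforce flow above was built with CastIron and Informatica Cloud; purely as an illustration of the same idea, a record can also be pushed through the standard Salesforce REST API. The instance URL, API version, external-ID field and token below are placeholders, not values from the project:

```python
import json
import urllib.request

# Sketch of upserting a legacy-system account into Salesforce via its REST
# API. Instance URL, API version, external-ID field and token are placeholders.
INSTANCE_URL = "https://example.my.salesforce.com"
API_VERSION = "v52.0"

def build_account_upsert(ext_id_field, ext_id, fields, token):
    """Build (but do not send) an upsert request: a PATCH on an external-ID
    path creates or updates the matching record in a single call."""
    url = "%s/services/data/%s/sobjects/Account/%s/%s" % (
        INSTANCE_URL, API_VERSION, ext_id_field, ext_id)
    return urllib.request.Request(
        url,
        data=json.dumps(fields).encode("utf-8"),
        headers={"Authorization": "Bearer " + token,
                 "Content-Type": "application/json"},
        method="PATCH",
    )

if __name__ == "__main__":
    req = build_account_upsert("Legacy_Id__c", "A-1001",
                               {"Name": "Acme Corp"}, token="placeholder-token")
    # urllib.request.urlopen(req) would send it; here we only inspect it.
    print(req.get_method(), req.full_url)
```

Upserting on an external-ID field rather than the Salesforce record ID keeps the legacy system as the source of truth and makes the integration idempotent.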
Design and Documentation: Prepared the following documents:
- Migration Document
- Technical Design Document
- Mobile Configuration Document
- Outlook Integration Document

No. 6
Project: Design and Development of Knowledge Base and Call Center Integration
Client: Reebok
Technology: Java, Apex, Visualforce
Integration Tools: CastIron, Informatica Cloud

The main features of this project are the implementation of Knowledge Base, Activities, and Case and Order Management. A key aspect is the call center integration and data migration, which let customers create orders and log cases.

Role: Technical Developer
Technology Used: Salesforce Configuration, Apex Programming, Knowledge Base, Email-to-Case, Outlook Integration, Apex Triggers, Call Center Integration and Data Migration

Responsibilities: Application configuration, programming in Apex script, configuring Email-to-Case, integrating Salesforce with Outlook, call center integration functionality.

Construction and Configuration:
- Construction of custom objects (Order)
- Call center & Outlook integration
- Configuration and customization of Email-to-Case according to the requirements

No. 7
Project: CiscoView EMS/NMS Application Development
Client: Cisco
Technology: Core Java, C, ADA
Configuration Management: ClearCase
Host OS: Solaris

CiscoView is a graphical SNMP-based device management tool that provides powerful real-time views of networked Cisco Systems devices. These views deliver a continuously updated physical picture of device configuration and performance conditions, with simultaneous views available for multiple device sessions.
Roles & Responsibilities:
- Wrote functional specifications and design specifications; coded and unit-tested the features.
- Developed the EMS/NMS application using core Java, C and the Cisco proprietary language ADA for SIP/SPA modules.
- Fixed issues found in unit testing as well as customer-found issues.
- Resolved external and internal queries.
- Performed code reviews for the SIP/SPA EMS/NMS application.