List of Publications by Claudio Gentile



Claudio Gentile
DiSTA, University of Insubria, Italy
claudio.gentile@uninsubria.it

November 6, 2013

Abstract

Contains the list of publications by Claudio Gentile, in reverse chronological order. Workshops are excluded from this list.

References

[1] N. Cesa-Bianchi, C. Gentile, F. Vitale, G. Zappella (2013). Random spanning trees and the prediction of weighted graphs. JOURNAL OF MACHINE LEARNING RESEARCH, vol. 14, p. 1251-1284, ISSN: 1533-7928.

[2] N. Alon, N. Cesa-Bianchi, C. Gentile, Y. Mansour (2013). From Bandits to Experts: A Tale of Domination and Independence. In Proc. of the 27th Conference on Neural Information Processing Systems (NIPS 2013). MIT Press, 2013.

[3] N. Cesa-Bianchi, C. Gentile, G. Zappella (2013). A Gang of Bandits. In Proc. of the 27th Conference on Neural Information Processing Systems (NIPS 2013). MIT Press, 2013.

[4] C. Gentile, M. Herbster, S. Pasteris (2013). Online Similarity Prediction of Networked Data from Known and Unknown Graphs. In Conference on Learning Theory, vol. 30, p. 662-695, JMLR Workshop and Conference Proceedings, MIT Press, 2013.

[5] E. Gofer, N. Cesa-Bianchi, C. Gentile, Y. Mansour (2013). Regret Minimization for Branching Experts. In Conference on Learning Theory, vol. 30, p. 618-638, JMLR Workshop and Conference Proceedings, MIT Press, 2013.

[6] Dekel O, Gentile C, Sridharan K (2012). Selective sampling and active learning from single and multiple teachers. JOURNAL OF MACHINE LEARNING RESEARCH, vol. 13, p. 2655-2697, ISSN: 1532-4435.

[7] Cesa-Bianchi N, Gentile C, Vitale F, Zappella G (2012). A linear time active learning algorithm for link classification. In Advances in Neural Information Processing Systems 25, 2012.

[8] Gentile C, Orabona F (2012). On multilabel classification and ranking with partial feedback. In Advances in Neural Information Processing Systems 25, 2012.

[9] Cesa-Bianchi N, Gentile C, Vitale F, Zappella G (2012). A correlation clustering approach to link classification in signed networks. In Proceedings of the 25th Annual Conference on Learning Theory. Edinburgh, Scotland, June 25-27, 2012, vol. 23, p. 34.1-34.20, JMLR Workshop and Conference Proceedings Volume 23: COLT 2012.

[10] Orabona F, Cesa-Bianchi N, Gentile C (2012). Beyond logarithmic bounds in online learning. In Proceedings of the Fifteenth International Conference on Artificial Intelligence and Statistics. La Palma, Canary Islands, April 21-23, 2012, vol. 22, p. 823-831, JMLR Workshop and Conference Proceedings Volume 22: AISTATS 2012.

[11] Crammer K, Gentile C (2012). Multiclass classification with bandit feedback using adaptive regularization. MACHINE LEARNING, vol. 90, p., ISSN: 0885-6125, doi: 10.1007/s10994-012-5321-8.

[12] Cesa-Bianchi N, Gentile C, Mansour Y (2012). Regret minimization for reserve prices in second-price auctions. In Proceedings of the 24th Annual ACM-SIAM Symposium on Discrete Algorithms (SODA 2013). New Orleans, Louisiana, USA, January 6-8, 2013, SIAM.

[13] Cavallanti G, Cesa-Bianchi N, Gentile C (2011). Learning Noisy Linear Classifiers via Adaptive and Selective Sampling. MACHINE LEARNING, vol. 83, p. 71-102, ISSN: 0885-6125.

[14] Cesa-Bianchi N, Gentile C, Vitale F, Zappella G (2011). See the tree through the lines: the Shazoo algorithm. In Advances in Neural Information Processing Systems 24: 25th Annual Conference on Neural Information Processing Systems 2011. Granada, Spain, December 12-15, 2011, vol. 24, p. 1584-1592, New York: Curran Associates, Inc.

[15] Crammer K, Gentile C (2011). Multiclass classification with bandit feedback using adaptive regularization. In Proceedings of the 28th International Conference on Machine Learning (ICML-11). Bellevue, Washington, USA, June 28th - July 2nd, 2011, p. 273-280, New York: ACM.

[16] Cesa-Bianchi N, Gentile C, Vitale F (2011). Predicting the labels of an unknown graph via adaptive exploration. THEORETICAL COMPUTER SCIENCE, vol. 412, p. 1791-1804 (special issue on ALT 2009), ISSN: 0304-3975.

[17] Cavallanti G, Cesa-Bianchi N, Gentile C (2010). Linear algorithms for online multitask classification. JOURNAL OF MACHINE LEARNING RESEARCH, vol. 11, p. 2901-2934, ISSN: 1532-4435.

[18] Dekel O, Gentile C, Sridharan K (2010). Robust Selective Sampling from Single and Multiple Teachers. In Proceedings of the 23rd Conference on Learning Theory. Haifa, Israel, June 27th-29th, 2010, p. 346-358, New York: Omnipress.

[19] Cesa-Bianchi N, Gentile C, Vitale F, Zappella G (2010). Active learning on trees and graphs. In Proceedings of the 23rd Conference on Learning Theory. Haifa, Israel, June 27th-29th, 2010, p. 320-332, New York: Omnipress.

[20] Cesa-Bianchi N, Gentile C, Vitale F, Zappella G (2010). Random spanning trees and the prediction of weighted graphs. In Proceedings of the 27th International Conference on Machine Learning (ICML-10). Haifa, Israel, June 21st-24th, 2010, p. 175-182, New York: Omnipress.

[21] Cesa-Bianchi N, Gentile C, Vitale F (2009). Learning unknown graphs. In Algorithmic Learning Theory, 20th International Conference. Porto, Portugal, October 3-5, 2009, p. 110-125, Springer.

[22] Cesa-Bianchi N, Gentile C, Vitale F (2009). Fast and optimal prediction of a labeled tree. In 22nd Annual Conference on Learning Theory (COLT 2009). Montreal, Canada, June 18th-21st, 2009.

[23] Cesa-Bianchi N, Gentile C, Orabona F (2009). Robust bounds for classification via selective sampling. In Proceedings of the 26th International Conference on Machine Learning. Montreal, Canada, June 14th-18th, 2009, p. 121-128, New York: Omnipress.

[24] Cesa-Bianchi N, Gentile C (2008). Improved risk tail bounds for online algorithms. IEEE TRANSACTIONS ON INFORMATION THEORY, vol. 54/1, p. 386-390, ISSN: 0018-9448.

[25] Del Bianco V, Gentile C, Lavazza L (2008). An Evaluation of Function Point Counting Based on Measurement-Oriented Models. In: Giuseppe Visaggio, Maria Teresa Baldassarre, Steve Linkman, Mark Turner, Evaluation and Assessment in Software Engineering (EASE 2008). Bari, Italy, June 26-27, 2008, published online (http://www.bcs.org/server.php?show=nav.10098), p. 1-10, British Computer Society.

[26] Bshouty N, Gentile C (Eds.) (2008). Machine Learning, special issue on COLT 2007. Berlin, Heidelberg, New York: Springer.

[27] Cavallanti G, Cesa-Bianchi N, Gentile C (2008). Linear classification and selective sampling under low noise conditions. In Proceedings of the Twenty-Second Annual Conference on Neural Information Processing Systems. Vancouver, British Columbia, Canada, December 8th-11th, 2008, p. 249-256, New York: Curran Associates, Inc.

[28] Cavallanti G, Cesa-Bianchi N, Gentile C (2008). Linear algorithms for online multitask classification. In Proceedings of the 21st Annual Conference on Learning Theory - COLT 2008. Helsinki, Finland, July 9-12, 2008, p. 251-262, New York: Omnipress.

[29] Bshouty N, Gentile C (Eds.) (2007). 20th Conference on Learning Theory. Berlin, Heidelberg, New York: Springer, ISBN: 9783540729259.

[30] Brotto C, Gentile C, Vitale F (2007). On higher-order Perceptron algorithms. In Proceedings of the 21st Conference on Neural Information Processing Systems (NIPS 2007). Vancouver, British Columbia, Canada, December 3-6, 2007, New York: Curran Associates, Inc.

[31] Cavallanti G, Cesa-Bianchi N, Gentile C (2007). Tracking the best hyperplane with a simple budget perceptron. MACHINE LEARNING, vol. 69 (2-3), p. 143-167 (special issue on COLT 2006), ISSN: 0885-6125.

[32] Cesa-Bianchi N, Gentile C, Zaniboni L (2006). Worst-case analysis of selective sampling for linear-threshold algorithms. JOURNAL OF MACHINE LEARNING RESEARCH, vol. 7, p. 1205-1230, ISSN: 1532-4435.

[33] Cesa-Bianchi N, Gentile C, Zaniboni L (2006). Incremental algorithms for hierarchical classification. JOURNAL OF MACHINE LEARNING RESEARCH, vol. 7, p. 31-54, ISSN: 1532-4435.

[34] Cesa-Bianchi N, Gentile C, Zaniboni L (2006). Hierarchical classification: combining Bayes with SVM. In Machine Learning, Proceedings of the Twenty-Third International Conference (ICML 2006). Pittsburgh, Pennsylvania, USA, June 25-29, 2006, p. 177-184, ACM.

[35] Cavallanti G, Cesa-Bianchi N, Gentile C (2006). Tracking the best hyperplane with a simple budget perceptron. In Learning Theory: 19th Annual Conference on Learning Theory, COLT 2006. Pittsburgh, PA, USA, June 22-25, 2006, p. 483-498, Springer.

[36] Cesa-Bianchi N, Conconi A, Gentile C (2005). A second-order perceptron algorithm. SIAM JOURNAL ON COMPUTING, vol. 34, p. 640-668, ISSN: 0097-5397, doi: 10.1137/S0097539703432542.

[37] Cesa-Bianchi N, Gentile C (2005). Improved risk tail bounds for on-line algorithms. In Advances in Neural Information Processing Systems 18 (NIPS 2005). Vancouver, British Columbia, Canada, December 5-8, 2005.

[38] Cesa-Bianchi N, Conconi A, Gentile C (2004). On the generalization ability of on-line learning algorithms. IEEE TRANSACTIONS ON INFORMATION THEORY, vol. 50, p. 2050-2057, ISSN: 0018-9448, doi: 10.1109/TIT.2004.833339.

[39] Cesa-Bianchi N, Gentile C, Zaniboni L (2004). Incremental algorithms for hierarchical classification. In Advances in Neural Information Processing Systems 17 (NIPS 2004). Vancouver, British Columbia, Canada, December 13-18, 2004.

[40] Cesa-Bianchi N, Gentile C, Zaniboni L (2004). Worst-case analysis of selective sampling for linear-threshold algorithms. In Advances in Neural Information Processing Systems 17 (NIPS 2004). Vancouver, British Columbia, Canada, December 13-18, 2004.

[41] Cesa-Bianchi N, Conconi A, Gentile C (2004). Regret bounds for hierarchical classification with linear-threshold functions. In Learning Theory, 17th Annual Conference on Learning Theory, COLT 2004, Lecture Notes in Computer Science. Banff, Canada, July 1-4, 2004, vol. 3120, p. 93-108, Springer.

[42] Gentile C (2003). The robustness of the p-norm algorithms. MACHINE LEARNING, vol. 53, p. 265-299, ISSN: 0885-6125.

[43] Gentile C (Ed.) (2003). Machine Learning, special issue on COLT 2001. Dordrecht: Kluwer Academic Press.

[44] Gentile C (2003). Fast feature selection from microarray expression data via multiplicative large margin algorithms. In Advances in Neural Information Processing Systems 16 (NIPS 2003). Vancouver, British Columbia, Canada, December 8-13, 2003, MIT Press.

[45] Cesa-Bianchi N, Conconi A, Gentile C (2003). Learning probabilistic linear-threshold classifiers via selective sampling. In Computational Learning Theory and Kernel Machines, 16th Annual Conference on Computational Learning Theory and 7th Kernel Workshop, COLT/Kernel 2003. Washington, DC, USA, August 24-27, 2003, p. 373-387, Springer.

[46] Cesa-Bianchi N, Conconi A, Gentile C (2002). Margin-based algorithms for information filtering. In Advances in Neural Information Processing Systems 15 (NIPS 2002). Vancouver, British Columbia, Canada, December 9-14, 2002, p. 470-477, MIT Press.

[47] Cancedda N, Goutte C, Renders J M, Cesa-Bianchi N, Conconi A, Li Y, Shawe-Taylor J, Vinokourov A, Graepel T, Gentile C (2002). Kernel methods for document filtering. In The Eleventh Text Retrieval Conference (TREC 2002). Gaithersburg, Maryland, USA, November 19-22, 2002.

[48] Cesa-Bianchi N, Conconi A, Gentile C (2002). A second-order Perceptron algorithm. In Computational Learning Theory, 15th Annual Conference on Computational Learning Theory, COLT 2002, Lecture Notes in Computer Science. Sydney, Australia, July 8-10, 2002, vol. 2375, p. 121-137, Springer.

[49] Auer P, Cesa-Bianchi N, Gentile C (2002). Adaptive and self-confident on-line learning algorithms. JOURNAL OF COMPUTER AND SYSTEM SCIENCES, vol. 64/1, p. 48-75 (special issue on COLT 2000), ISSN: 0022-0000.

[50] Gentile C (2001). A new approximate maximal margin classification algorithm. JOURNAL OF MACHINE LEARNING RESEARCH, p. 213-242, ISSN: 1532-4435.

[51] Gentile C, Helmbold D (2001). Improved lower bounds for learning from noisy examples: an information-theoretic approach. INFORMATION AND COMPUTATION, p. 133-155, ISSN: 0890-5401.

[52] Cesa-Bianchi N, Conconi A, Gentile C (2001). On the generalization ability of on-line learning algorithms. In Advances in Neural Information Processing Systems 14 (NIPS 2001). Vancouver, British Columbia, Canada, December 3-8, 2001, p. 359-366, MIT Press.

[53] Apolloni B, Gentile C (2000). P-sufficient statistics for PAC learning k-term-DNF formulas through enumeration. THEORETICAL COMPUTER SCIENCE, p. 1-37, ISSN: 0304-3975.

[54] Gentile C (2000). A new approximate maximal margin classification algorithm. In Advances in Neural Information Processing Systems 13 (NIPS 2000). Denver, CO, USA, November 28-30, 2000, p. 500-506, MIT Press.

[55] Auer P, Gentile C (2000). Adaptive and self-confident on-line learning algorithms. In Proceedings of the Thirteenth Annual Conference on Computational Learning Theory (COLT 2000). Palo Alto, California, USA, June 28 - July 1, 2000, p. 107-117, Morgan Kaufmann.

[56] Gentile C, Littlestone N (1999). The robustness of the p-norm algorithms. In Proceedings of the Twelfth Annual Conference on Computational Learning Theory, COLT 1999. Santa Cruz, CA, USA, July 7-9, 1999, p. 1-11, ACM.

[57] Apolloni B, Gentile C (1998). Sample size lower bounds in PAC learning by algorithmic complexity theory. THEORETICAL COMPUTER SCIENCE, ISSN: 0304-3975.

[58] Gentile C, Helmbold D (1998). Improved lower bounds for learning from noisy examples: an information-theoretic approach. In Proceedings of the Eleventh Annual Conference on Computational Learning Theory, COLT 1998. Madison, Wisconsin, USA, July 24-26, 1998, p. 104-115, ACM.

[59] Gentile C, Warmuth M (1998). Linear hinge loss and average margin. In Advances in Neural Information Processing Systems 11 (NIPS 1998). Denver, Colorado, USA, November 30 - December 5, 1998, p. 225-231, MIT Press.