Alon Orlitsky - Publications

Primary publications

  1. Communication with secrecy constraints, A. Orlitsky and A. El Gamal, Proceedings 16th ACM Symposium on Theory of Computing, May 1984, pp. 217-224.
  2. Interactive data compression, A. El Gamal and A. Orlitsky, Proceedings 25th IEEE Symposium on Foundations of Computer Science, October 1984, pp. 100-108.
  3. Communication complexity, A. Orlitsky and A. El Gamal, Complexity in Information Theory, Y. S. Abu-Mostafa (editor), Springer-Verlag, 1986, pp. 16-61.
  4. Communication issues in distributed computing, A. Orlitsky, Ph.D. Thesis, Electrical Engineering Department, Stanford University, 1986.
  5. Self-avoiding random loops, L. A. Dubins, A. Orlitsky, J. A. Reeds and L. A. Shepp, IEEE Transactions on Information Theory, IT-34:6 (November 1988), pp. 1509-1516.
  6. On the evolution of islands, P. G. Doyle, C. Mallows, A. Orlitsky and L. A. Shepp, Israel Journal of Math, 67:1 (1989), pp. 34-41.
  7. Feedback in discrete communication, A. Orlitsky, DIMACS Series in Discrete Mathematics and Theoretical Computer Science, Volume 2: Distributed Computing and Cryptography, J. Feigenbaum and M. Merritt (editors), AMS-ACM, 1989, pp. 213-228.
  8. On curbing virus propagation, A. Orlitsky and L. A. Shepp, BL011217-891218-21TM, 1989.
  9. Average and randomized communication complexity, A. Orlitsky and A. El Gamal, IEEE Transactions on Information Theory, 36:1 (January 1990), pp. 3-16.
  10. Monte Carlo generation of self-avoiding walks with fixed endpoints and fixed length, N. Madras, A. Orlitsky and L. A. Shepp, Journal of Statistical Physics, 58 (1990), pp. 159-183.
  11. A spectral lower bound technique for the size of decision trees and two-level and/or circuits, Y. Brandman, A. Orlitsky and J. Hennessy, IEEE Transactions on Computers, 39:2 (1990), pp. 282-287.
  12. Worst-case interactive communication I: Two messages are almost optimal, A. Orlitsky, IEEE Transactions on Information Theory, 36:5 (September 1990), pp. 1111-1126.
  13. Worst-case interactive communication II: Two messages are not optimal, A. Orlitsky, IEEE Transactions on Information Theory, 37:4 (July 1991), pp. 995-1005.
  14. Counting intermodulation products for AMPS cellular radio frequency assignments, J. E. Mazo and A. Orlitsky, technical memorandum, BL011217-920207-02TM, 1992.
  15. Average-case interactive communication, A. Orlitsky, IEEE Transactions on Information Theory, 38:4 (July 1992), pp. 1534-1547.
  16. Secrecy enhancement via public discussion, A. Orlitsky and A. Wigderson, IEEE International Symposium on Information Theory, January 1993.
  17. Asymptotic component densities in programmable gate arrays realizing all circuits of a given size, T. Berger, A. Hekstra and A. Orlitsky, Algorithmica, 9:2 (February 1993), pp. 101-127.
  18. On data compression with side information, A. Orlitsky, IEEE International Information Theory Workshop, 1993.
  19. Three results on interactive communication, M. Naor, A. Orlitsky and P. Shor, IEEE Transactions on Information Theory, 39:5 (September 1993), pp. 1608-1615.
  20. Interactive communication of balanced distributions and of correlated files, A. Orlitsky, SIAM Journal on Discrete Mathematics, 6:4 (November 1993), pp. 548-564.
  21. Privacy, additional information, and communication, R. Bar-Yehuda, B. Chor, E. Kushilevitz and A. Orlitsky, IEEE Transactions on Information Theory, 39:6 (November 1993), pp. 1930-1943.
  22. Lower bounds on threshold and related circuits via communication complexity, V. P. Roychowdhury, A. Orlitsky and K. Y. Siu, IEEE Transactions on Information Theory, 40:2 (March 1994), pp. 467-474.
  23. Neural models and spectral methods, V. P. Roychowdhury, K. Y. Siu and A. Orlitsky, Theoretical Advances in Neural Computation and Learning, V. P. Roychowdhury, K. Y. Siu and A. Orlitsky (editors), Kluwer Academic Publishing, 1994.
  24. A lower bound on the expected length of one-to-one codes, N. Alon and A. Orlitsky, IEEE Transactions on Information Theory, 40:4 (September 1994), pp. 1670-1672.
  25. Vector analysis of threshold functions, V. Roychowdhury, K. Y. Siu, A. Orlitsky and T. Kailath, Information and Computation, 120:1 (July 1995), pp. 22-31.
  26. Repeated communication and Ramsey graphs, N. Alon and A. Orlitsky, IEEE Transactions on Information Theory, 41:5 (September 1995), pp. 1276-1289.
  27. On edge-colored interior planar graphs on a circle and the expected number of RNA secondary structures, A. Orlitsky and S. S. Venkatesh, Discrete Applied Math., 64(2):151-178, January 1996.
  28. Two UNIX privacy flaws, A. Orlitsky and E. Telatar, BL011317-96031909TM, 1996.
  29. Source coding and graph entropies, N. Alon and A. Orlitsky, IEEE Transactions on Information Theory, IT-42:5 (September 1996), pp. 1329-1339.
  30. Design of shapes for precise image registration, A. M. Bruckstein, A. Orlitsky and L. O'Gorman, IEEE Transactions on Information Theory, 44:7 (November 1998), pp. 3156-3162.
  31. Zero-error information theory, J. Korner and A. Orlitsky, IEEE Transactions on Information Theory 44:6 (October 1998), pp. 2207-2229.
  32. Server assisted speech recognition over the internet, A. Klautau, N. Jevtic, and A. Orlitsky, International Conference on Acoustics, Speech, and Signal Processing (ICASSP), pp. 3690-3693, June 2000.
  33. On codes that avoid specified differences, B. Moision, A. Orlitsky, and P. Siegel, IEEE Transactions on Information Theory, 47:1 (January 2001), pp. 433-442.
  34. Coding for computing, A. Orlitsky and J. R. Roche, IEEE Transactions on Information Theory, 47:3 (March 2001), pp. 903-917.
  35. Information theory, A. Orlitsky, Encyclopedia of Physical Science and Technology, Academic Press, 2001.
  36. Practical protocols for interactive communication, A. Orlitsky and K. Viswanathan, IEEE International Symposium on Information Theory, June 2001.
  37. Estimated rank pruning and Java-based speech recognition, N. Jevtic, A. Klautau, and A. Orlitsky, IEEE Automatic Speech Recognition and Understanding Workshop, pp. 401-404, December 2001.
  38. Scalar vs. vector quantization: worst-case analysis, A. Orlitsky, IEEE Transactions on Information Theory, 48:6 (June 2002), pp. 1393-1409.
  39. Stopping sets and the girth of Tanner graphs, A. Orlitsky, R. Urbanke, K. Viswanathan, and J. Zhang, IEEE International Symposium on Information Theory, June 2002.
  40. Universal compression of unknown alphabets, N. Jevtic, A. Orlitsky, and N.P. Santhanam, IEEE International Symposium on Information Theory, June 2002.
  41. Finite length analysis of large- and irregular-left degree LDPC codes over erasure channels, J. Zhang and A. Orlitsky, IEEE International Symposium on Information Theory, June 2002.
  42. Combined binary classifiers with applications to speech recognition, A. Klautau, N. Jevtic, and A. Orlitsky, International Conference on Spoken Language Processing, Volume 4, pp. 2469-2472, September 2002.
  43. On nearest-neighbor ECOC with applications to all-pair multiclass SVM, A. Klautau, N. Jevtic, and A. Orlitsky, Journal of Machine Learning Research, vol. 4, iss. 1, April 2003, pp. 1-15.
  44. One-way communication and error-correcting codes, A. Orlitsky and K. Viswanathan, IEEE Transactions on Information Theory, 49:7 (July 2003), pp 1781-1788.
  45. On capacity-achieving sequences of degree distributions and their optimality, A. Orlitsky, K. Viswanathan, and J. Zhang, International Symposium on Information Theory, June 2003.
  46. Discriminative gaussian mixture models: a comparison with kernel classifiers, A. Klautau, N. Jevtic, and A. Orlitsky, 20th International Conference on Machine Learning, pp. 353-360, August 2003.
  47. Recent results on compression of large alphabets, A.K. Dhulipala, A. Orlitsky, and Sajama, The Allerton Conference, October 2003.
  48. Always Good Turing: asymptotically optimal probability estimation, A. Orlitsky, N.P. Santhanam, and J. Zhang, Science, vol 302, no. 5644, pp. 427-431, October 2003.
  49. Always Good Turing: asymptotically optimal probability estimation, A. Orlitsky, N.P. Santhanam, and J. Zhang, Proceedings of the 44th Annual Symposium on Foundations of Computer Science, October 2003.
  50. On the relation between additive smoothing and universal coding, N. Jevtic and A. Orlitsky, IEEE Automatic Speech Recognition and Understanding Workshop, December 2003.
  51. Low (size) and order in distribution modeling, A. Orlitsky, N.P. Santhanam, K. Viswanathan, and J. Zhang, Conference on Information Sciences and Systems, March 2004.
  52. Algorithms for modeling distributions over large alphabets, A. Orlitsky, Sajama, N.P. Santhanam, K. Viswanathan, and J. Zhang, IEEE International Symposium on Information Theory, July 2004.
  53. Relative redundancy: a more stringent performance guarantee for universal compression, A. Orlitsky, N.P. Santhanam, and J. Zhang, IEEE International Symposium on Information Theory, July 2004.
  54. Universal compression of memoryless sources over unknown alphabets, A. Orlitsky, N.P. Santhanam, and J. Zhang, IEEE Transactions on Information Theory, IT-50:7 (July 2004), pp. 1469-1481.
  55. On modeling profiles instead of values, A. Orlitsky, N.P. Santhanam, K. Viswanathan, and J. Zhang, 20th Conference on Uncertainty in Artificial Intelligence, July 2004, pp. 426-435.
  56. Speaking of infinity, A. Orlitsky and N.P. Santhanam, IEEE Transactions on Information Theory, IT-50:10 (October 2004), pp. 2215-2230.
  57. Semi-parametric exponential family PCA, Sajama and A. Orlitsky, Neural Information Processing Systems Conference (NIPS), pp. 1177-1184, November 2004.
  58. A lower bound on compression of unknown alphabets, N. Jevtic, A. Orlitsky, and N.P. Santhanam, Theoretical Computer Science, 332 (February 2005), pp. 293-311.
  59. Stopping set distribution of LDPC code ensembles, A. Orlitsky, K. Viswanathan, and J. Zhang, IEEE Transactions on Information Theory, IT-51:3 (March 2005), pp. 929-953.
  60. Estimating and computing density based distance metrics, Sajama and A. Orlitsky, 22nd International Conference on Machine Learning, August 2005.
  61. Supervised dimensionality reduction using mixture models, Sajama and A. Orlitsky, International Conference on Machine Learning, August 2005.
  62. Convergence of profile-based estimators, A. Orlitsky, N.P. Santhanam, K. Viswanathan, and J. Zhang, International Symposium on Information Theory, September 2005.
  63. Redundancy over continuous alphabets, A. Orlitsky and N.P. Santhanam, The Allerton Conference, September 2005.
  64. Theoretical and experimental results on modeling low probabilities, A. Orlitsky, N.P. Santhanam, K. Viswanathan, and J. Zhang, Information Theory Workshop, March 2006.
  65. Limit results on pattern entropy, A. Orlitsky, N.P. Santhanam, K. Viswanathan, and J. Zhang, IEEE Transactions on Information Theory, IT-52:7 (July 2006), pp. 2954-2964.
  66. Relative redundancy for large alphabets, A. Orlitsky, N. Santhanam, and J. Zhang, IEEE International Symposium on Information Theory, July 2006.
  67. Universal compression of Markov and related sources over arbitrary alphabets, A.K. Dhulipala and A. Orlitsky, IEEE Transactions on Information Theory, IT-52:9 (September 2006), pp. 4182-4190.
  68. Modifying distances, Sajama and A. Orlitsky, Semi-Supervised Learning, O. Chapelle, B. Scholkopf, and A. Zien (editors), pp. 309-330, The MIT Press, 2006.
  69. On codes with local joint constraints, B. Moision, A. Orlitsky, and P. Siegel, Linear Algebra and its Applications, 422 (2) (April 2007), pp. 442-454.
  70. Further results on relative redundancy, H. Das, A. Orlitsky, N. Santhanam, and J. Zhang, IEEE International Symposium on Information Theory, July 2008.
  71. The maximum likelihood probability of skewed patterns, A. Orlitsky and S. Pan, In Proceedings of the International Symposium on Information Theory (ISIT), pp. 1130–1134, June 2009.
  72. The maximum likelihood probability of unique-singleton, ternary, and length-7 patterns, J. Acharya, A. Orlitsky, and S. Pan, In Proceedings of the International Symposium on Information Theory (ISIT), pp. 1135–1139, June 2009.
  73. Silence-based communication, A.K. Dhulipala, C. Fragouli, and A. Orlitsky, IEEE Transactions on Information Theory, IT-56(1):350–366, January 2010.
  74. On reconstructing a string from its substring compositions, J. Acharya, H. Das, O. Milenkovic, A. Orlitsky, and S. Pan, In Proceedings of the International Symposium on Information Theory (ISIT), pp. 1238–1242, June 2010.
  75. Exact calculation of pattern probabilities, J. Acharya, H. Das, H. Mohimani, A. Orlitsky, and S. Pan, In Proceedings of the International Symposium on Information Theory (ISIT), pp. 1498–1502, June 2010.
  76. Competitive closeness testing, J. Acharya, H. Das, A. Jafarpour, A. Orlitsky, and S. Pan, Journal of Machine Learning Research (JMLR): Workshop and Conference Proceedings, 19:47–68, June 2011. (Appeared at the 24th Annual Conference on Learning Theory.)
  77. Algebraic computation of pattern maximum likelihood, J. Acharya, H. Das, A. Orlitsky, and S. Pan, In Proceedings of the International Symposium on Information Theory (ISIT), pp. 400–404, July 2011.
  78. Competitive classification and closeness testing, J. Acharya, H. Das, A. Jafarpour, A. Orlitsky, S. Pan, and A. Suresh, Journal of Machine Learning Research (JMLR): Workshop and Conference Proceedings, 23:22.1–22.18, June 2012. (Appeared at the 25th Annual Conference on Learning Theory.)
  79. Estimating multiple concurrent processes, J. Acharya, H. Das, A. Jafarpour, A. Orlitsky, and S. Pan, In Proceedings of the International Symposium on Information Theory (ISIT), pp. 1628–1632, July 2012.
  80. On the query computation and verification of functions, H. Das, A. Jafarpour, A. Orlitsky, S. Pan, and A. Suresh, In Proceedings of the International Symposium on Information Theory (ISIT), pp. 2711–2715, July 2012.
  81. Tight bounds on profile redundancy and distinguishability, J. Acharya, H. Das, and A. Orlitsky, In Proceedings of the 26th Annual Conference on Neural Information Processing (NIPS), November 2012.
  82. A competitive test for uniformity of monotone distributions, J. Acharya, A. Jafarpour, A. Orlitsky, and A. T. Suresh, Journal of Machine Learning Research (JMLR): Workshop and Conference Proceedings, 31:57-65, 2013. (Appeared at the 16th International Conference on Artificial Intelligence and Statistics, April 2013.)
  83. Optimal probability estimation with applications to prediction and classification, J. Acharya, A. Jafarpour, A. Orlitsky, and A. T. Suresh, Journal of Machine Learning Research (JMLR): Workshop and Conference Proceedings, 30:764-796, June 2013. (Appeared at the 26th Annual Conference on Learning Theory.)
  84. Tight bounds for universal compression of large alphabets, J. Acharya, A. Jafarpour, A. Orlitsky, and A. T. Suresh, In Proceedings of the International Symposium on Information Theory (ISIT), pp. 2875-2879, July 2013.
  85. Quadratic-backtracking algorithm for string reconstruction from substring compositions, J. Acharya, H. Das, O. Milenkovic, A. Orlitsky, and S. Pan, In Proceedings of the International Symposium on Information Theory (ISIT), pp. 1296-1300, July 2014.
  86. Sorting with adversarial comparators and application to density estimation, J. Acharya, A. Jafarpour, A. Orlitsky, and A. T. Suresh, In Proceedings of the International Symposium on Information Theory (ISIT), pp. 1682-1686, July 2014.
  87. Efficient compression of monotone and m-modal distributions, J. Acharya, A. Jafarpour, A. Orlitsky, and A. T. Suresh, In Proceedings of the International Symposium on Information Theory (ISIT), pp. 1867-1871, July 2014.
  88. Poissonization and universal compression of envelope classes, J. Acharya, A. Jafarpour, A. Orlitsky, and A. T. Suresh, In Proceedings of the International Symposium on Information Theory (ISIT), pp. 1872-1876, July 2014.
  89. Sublinear algorithms for outlier detection and generalized closeness testing, J. Acharya, A. Jafarpour, A. Orlitsky, and A. T. Suresh, In Proceedings of the International Symposium on Information Theory (ISIT), pp. 3200-3204, July 2014.
  90. Near-optimal-sample estimators for spherical Gaussian mixtures, J. Acharya, A. Jafarpour, A. Orlitsky, and A. T. Suresh, Submitted to NIPS 2014.

Edited book

  1. Theoretical advances in neural computation and learning, V. P. Roychowdhury, K. Y. Siu and A. Orlitsky, editors, Kluwer Academic Publishing, 1994.

Additional publications

  1. Interactive data compression, A. El Gamal and A. Orlitsky, IEEE International Symposium on Information Theory, June 1985.
  2. Communication complexity, A. Orlitsky and A. El Gamal, IEEE International Symposium on Information Theory, October 1986.
  3. Optimal computer organization for basic machine models, T. Berger, A. Hekstra and A. Orlitsky, IEEE International Symposium on Information Theory, June 1988.
  4. Feedback in discrete communication, A. Orlitsky, IEEE International Symposium on Information Theory, January 1990.
  5. Two messages are almost optimal for conveying information, A. Orlitsky, Proceedings of the 9th Annual ACM Symposium on Principles of Distributed Computing, August 1990, pp. 219-232.
  6. Improved bounds on interactive communication, A. Orlitsky and M. Naor, IEEE International Symposium on Information Theory, June 1991.
  7. Interactive communication of balanced distributions, A. Orlitsky, IEEE International Symposium on Information Theory, June 1991.
  8. Interactive communication: balanced distributions, correlated files, and average-case complexity, A. Orlitsky, Proceedings 32nd IEEE Symposium on Foundations of Computer Science, 1991, pp. 228-238.
  9. On the circuit complexity of neural networks, V. P. Roychowdhury, K. Y. Siu, A. Orlitsky and T. Kailath, 4th Annual Conference on Neural Information Processing Systems (NIPS), 1990, pp. 953-959.
  10. A geometric approach to threshold-circuit complexity, V. P. Roychowdhury, K. Y. Siu, A. Orlitsky and T. Kailath, Proceedings of the 4th Annual Workshop on Computational Learning Theory, L. G. Valiant and M. K. Warmuth (editors), Morgan Kaufman Publishers, Inc., 1991, pp. 97-111.
  11. Average-case interactive communication, A. Orlitsky, Sequences II, R. Capocelli, A. De Santis and U. Vaccaro (editors), Springer-Verlag, 1991, pp. 79-103.
  12. Data compression with side information and graph entropy, A. Orlitsky, IEEE International Symposium on Information Theory, January 1993.
  13. Zero-error channel and source coding, N. Alon and A. Orlitsky, IEEE International Symposium on Information Theory, June 1994.
  14. A lower bound on the expected length of one-to-one codes, N. Alon and A. Orlitsky, IEEE International Symposium on Information Theory, June 1994.
  15. Open and solved graph problems in data compression, A. Orlitsky, Hypergraphs and Symmetric Structures, Janos Bolyai Math. Soc., June 1995.
  16. Coding for computing, A. Orlitsky and J. R. Roche, IEEE International Symposium on Information Theory, September 1995.
  17. Coding for computing, A. Orlitsky and J. R. Roche, 36th Annual Symposium on Foundations of Computer Science, November 1995, pp. 502-511.
  18. A discrete approach to quantization and source coding, A. Orlitsky, IEEE Information Theory Workshop, June 1996.
  19. Zero-error information theory, J. Korner and A. Orlitsky, Information Theory: 50 Years of Discovery, S. Verdu and S. McLaughlin, Editors, IEEE Press, 1999, pp. 163-185.
  20. On codes that avoid specified differences, B. Moision, A. Orlitsky, and P. Siegel, Globecom, pp. 878-882, December 1999.
  21. Worst-case rate of scalar vs. vector quantization, A. Orlitsky, IEEE International Symposium on Information Theory, June 2000.
  22. One-way communication and error-correcting codes, A. Orlitsky and K. Viswanathan, IEEE International Symposium on Information Theory, June 2002.
  23. Performance of universal codes over infinite alphabets, A. Orlitsky and N.P. Santhanam, Proceedings of the Data Compression Conference, pp. 402-410, March 2003.
  24. On compression and modeling of sparse data, A. Orlitsky, N.P. Santhanam, K. Viswanathan, and J. Zhang, 3rd Asian European Workshop on Coding and Information Theory, June 2003.
  25. Bounds on compression of unknown alphabets, A. Orlitsky, N.P. Santhanam, and J. Zhang, International Symposium on Information Theory, June 2003.
  26. Stopping set distribution of LDPC code ensembles, A. Orlitsky, K. Viswanathan, and J. Zhang, International Symposium on Information Theory, June 2003.
  27. On the redundancy of HMM patterns, A.K. Dhulipala and A. Orlitsky, IEEE International Symposium on Information Theory, July 2004.
  28. An information theoretic approach to modeling low probabilities, A. Orlitsky, N.P. Santhanam, K. Viswanathan, and J. Zhang, The Allerton Conference, October 2004.
  29. Limit results on pattern entropy, A. Orlitsky, N.P. Santhanam, K. Viswanathan, and J. Zhang, Information Theory Workshop, October 2004.
  30. Innovation and pattern entropy of stationary processes, A. Orlitsky, N.P. Santhanam, K. Viswanathan, and J. Zhang, International Symposium on Information Theory, September 2005.
  31. Silence is golden and time is money: power-aware communication for sensor networks, C. Fragouli and A. Orlitsky, The Allerton Conference, September 2005.
  32. Silence based communication for sensor networks, A.K. Dhulipala, C. Fragouli, and A. Orlitsky, IEEE International Symposium on Information Theory, July 2006.
  33. Population estimation with performance guarantees, A. Orlitsky, N.P. Santhanam, and K. Viswanathan, Proceedings of the International Symposium on Information Theory (ISIT), pp. 2026–2030, July 2007.
  34. Single versus multiple rounds for distributed function computation, A.K. Dhulipala, C. Fragouli, and A. Orlitsky, In Proceedings of the International Symposium on Information Theory (ISIT), July 2007.
  35. New tricks for old dogs: Large alphabet probability estimation, N.P. Santhanam, A. Orlitsky, and K. Viswanathan, In Proceedings of the Information Theory Workshop (ITW), pp. 638–643, 2007.
  36. Further results on relative redundancy, H. Das, A. Orlitsky, N.P. Santhanam, and J. Zhang, In Proceedings of the International Symposium on Information Theory (ISIT), pp. 1940–1943, July 2008.
  37. Recent results on pattern maximum likelihood, J. Acharya, A. Orlitsky, and S. Pan, In Proceedings of the Information Theory Workshop (ITW), pp. 251–255, June 2009.
  38. Deterministic calculation of pattern probabilities, J. Acharya, H. Das, H. Mohimani, A. Orlitsky, and S. Pan, In Proceedings of the Information Theory Workshop (ITW), June 2010.
  39. Classification using pattern probability estimators, J. Acharya, H. Das, A. Orlitsky, S. Pan, and N.P. Santhanam, In Proceedings of the International Symposium on Information Theory (ISIT), pp. 1493–1497, June 2010.
  40. Expected query complexity of symmetric boolean functions, J. Acharya, A. Jafarpour, and A. Orlitsky, In Proceedings of the 49th Annual Allerton Conference, pp. 26–29, September 2011. (Invited paper).