Tensor networks (TNs) are factorizations of high-order tensors into networks of low-order tensors; they have been studied in quantum physics, chemistry, and applied mathematics. TNs are increasingly applied in machine learning and AI for modeling multimodal data, compressing deep neural networks, and scaling algorithms to high-dimensional data. In particular, they have been used successfully to tackle challenging problems in data completion, model compression, multimodal fusion, multitask learning, and learning theory. TNs are rapidly emerging and finding many interesting applications in machine learning, including modeling probability functions and implementing efficient TN computations on GPUs. However, the topic of TNs in machine learning is relatively young, and many open problems remain to be explored. This workshop will promote discussion among researchers investigating innovative TN techniques, from fundamental theory and algorithms for machine learning and deep learning to applications in computer vision, biomedical image processing, NLP, and beyond.
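To make the opening definition concrete, here is a minimal sketch of one widely used tensor network, the tensor-train (TT) decomposition, which factors a high-order tensor into a chain of order-3 cores via sequential truncated SVDs. The function names `tt_decompose` and `tt_reconstruct` are hypothetical helpers for illustration, not part of any library mentioned here.

```python
import numpy as np

def tt_decompose(tensor, max_rank):
    """Factor a high-order tensor into a train (chain) of order-3 cores
    using sequential truncated SVDs (a sketch of the TT-SVD algorithm)."""
    dims = tensor.shape
    cores = []
    rank = 1
    mat = tensor.reshape(rank * dims[0], -1)
    for k in range(len(dims) - 1):
        u, s, vt = np.linalg.svd(mat, full_matrices=False)
        r_new = min(max_rank, len(s))
        # Left singular vectors become the k-th core: (rank, dims[k], r_new)
        cores.append(u[:, :r_new].reshape(rank, dims[k], r_new))
        # Carry the remainder forward and refold for the next mode
        mat = (np.diag(s[:r_new]) @ vt[:r_new]).reshape(r_new * dims[k + 1], -1)
        rank = r_new
    cores.append(mat.reshape(rank, dims[-1], 1))
    return cores

def tt_reconstruct(cores):
    """Contract the chain of cores back into a full tensor."""
    out = cores[0]
    for core in cores[1:]:
        out = np.tensordot(out, core, axes=([-1], [0]))
    return out.squeeze(axis=(0, -1))

# Usage: a 4th-order tensor is stored as four small order-3 cores instead
# of one dense array; low max_rank trades accuracy for compression.
t = np.random.default_rng(0).standard_normal((3, 4, 5, 6))
cores = tt_decompose(t, max_rank=100)
print(np.allclose(tt_reconstruct(cores), t))
```

With `max_rank` large enough that no singular values are discarded, the reconstruction is exact; smaller ranks give the lossy compression that underlies the model-compression and data-completion applications mentioned above.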