Hello, I'm Albert! πŸ‘‹

Welcome to my GitHub profile! I'm currently working on my PhD in Machine Learning, passionate about Sustainable Machine Learning with a particular focus on Tensor Network-based methods and Uncertainty Estimation in ML. I love solving problems, collaborating on open-source projects, and learning new technologies in my free time.


πŸ‘¨β€πŸ’» About Me

  • πŸŽ“ I’m currently a PhD candidate at Delft University of Technology, working on Probabilistic Tensor Methods.
  • πŸ”­ My research interests include: Tensor Networks, Deep Learning, Probabilistic ML, Recommender Systems, Reasoning Models.
  • πŸ‘― I’m looking to collaborate on any topic connected to my research interests!πŸ˜‰
  • πŸ“« How to reach me: Linkedin, Telegram
  • ⚑ Fun fact: Learning Deep Learning with CS231n has been an exciting challenge! Batch normalization still keeps me on my toes, but I'm tackling it one layer at a time!

πŸ› οΈ Technologies & Tools

Here are some of the tools and technologies I frequently use:

  • πŸ’» Languages: Python, SQL, C/C++.
  • πŸ”§ Frameworks & Libraries: Numpy, Jax, PyTorch, Numba; Pandas, Matplotlib; Scikit-Learn, Hugging Face.
  • 🌐 Tools & Platforms: Git, Linux, Docker, CI/CD, Cloud, MLOps.

πŸ“‚ Research & Projects

Here are some of my open-source projects:

  • Tensor Network Based Feature Learning Model - The Feature Learning (FL) model introduces a learnable Canonical Polyadic Decomposition (CPD) for tensor-product features, enabling efficient learning of hyperparameters alongside model parameters using Alternating Least Squares (ALS). Experiments show that FL consistently trains 3-5 times faster than standard cross-validation approaches while achieving comparable prediction quality.
  • Dynamic Collaborative Filtering - TIRecA is a novel collaborative filtering model designed for sequential recommender systems that efficiently updates its parameters with only new data, allowing for incremental addition of users and items. Experiments on four datasets show that TIRecA achieves comparable prediction quality to baseline models while being 10-20 times faster in training.
  • Federated Privacy-Preserving Collaborative Filtering for on-device Next App Prediction - We propose a novel SeqMF model for predicting the next app launch in mobile device usage, addressing the challenges of sequential data, distributed user feedback, and data privacy. By modifying matrix factorization and incorporating federated learning, our model ensures privacy protection while processing user data across devices. Evaluations in static and dynamic environments show that SeqMF provides comparable quality to other methods, with a superior privacy-utility trade-off in dynamic environments that simulate real-world usage.
  • MEKER: Memory Efficient Knowledge Embedding Representation for Link Prediction and Question Answering - We propose MEKER, a memory-efficient Knowledge Graph (KG) embedding model that represents the KG as a 3rd-order binary tensor and utilizes a generalized CP decomposition to optimize memory usage without backpropagation. Our model achieves state-of-the-art performance on link prediction tasks and KG-based Question Answering while reducing memory costs during training.
  • Image Caption Generator - In this rather fun project, we compare encoder-decoder models with different backbones, including CNN, RNN, and Transformers, for the task of image content description generation. Experiments on the Flickr8k dataset and evaluation on COCO show that the best model, DenseNet161 + Transformer, achieves a BLEU@1 score of 62.57, while CNN + LSTM configurations perform faster but lack the quality of Transformer-based models.

Feel free to explore my GitHub repositories for more research-related projects!
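Several of the projects above build on the Canonical Polyadic Decomposition (CPD) fitted with Alternating Least Squares (ALS). As an illustration of that core idea (a minimal sketch, not the actual project code), here is a rank-R CP-ALS fit for a 3rd-order tensor in NumPy:

```python
import numpy as np

def unfold(T, mode):
    """Mode-n matricization of a 3rd-order tensor (C-order column indexing)."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def khatri_rao(X, Y):
    """Column-wise Kronecker (Khatri-Rao) product of two factor matrices."""
    R = X.shape[1]
    return np.einsum('ir,jr->ijr', X, Y).reshape(-1, R)

def cp_als(T, rank, n_iter=500, seed=0):
    """Fit factors A, B, C so that T[i,j,k] ~ sum_r A[i,r] * B[j,r] * C[k,r].

    Each step fixes two factors and solves an ordinary linear
    least-squares problem for the third -- the essence of ALS.
    """
    rng = np.random.default_rng(seed)
    I, J, K = T.shape
    A = rng.standard_normal((I, rank))
    B = rng.standard_normal((J, rank))
    C = rng.standard_normal((K, rank))
    for _ in range(n_iter):
        A = unfold(T, 0) @ np.linalg.pinv(khatri_rao(B, C)).T
        B = unfold(T, 1) @ np.linalg.pinv(khatri_rao(A, C)).T
        C = unfold(T, 2) @ np.linalg.pinv(khatri_rao(A, B)).T
    return A, B, C
```

On a tensor that is exactly low-rank, this recovers a near-perfect reconstruction; in practice the projects above extend this basic scheme (e.g. with learnable hyperparameters or memory-efficient generalized losses).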


πŸ“§ Contact & Social Media

πŸ’¬ Let’s Connect!

