import numpy as np

# Each row is an item embedding; the query is a vector in the same space.
embeddings = np.array([
    [0.9, 0.1, 0.2],
    [0.1, 0.8, 0.3],
    [0.7, 0.2, 0.4],
])
query = np.array([0.8, 0.2, 0.1])

# Normalize rows and the query to unit length so the dot product
# below equals cosine similarity.
normalized_embeddings = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
normalized_query = query / np.linalg.norm(query)

# One matrix-vector product scores every row against the query at once.
cosine_similarity = normalized_embeddings @ normalized_query

# argsort is ascending, so reverse for a highest-similarity-first ranking.
ranked_indices = np.argsort(cosine_similarity)[::-1]

print(cosine_similarity)
print(ranked_indices)  # here: [0 2 1]
A lot of machine learning reduces to linear algebra with better tooling. Dot products, norms, matrix multiplication, and projections show up in recommendation systems, embeddings, PCA, and optimization. I keep implementations small and testable so they stay understandable when somebody revisits them months later.
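The projection case the paragraph mentions can be sketched the same way: PCA is a covariance eigendecomposition followed by a dot product per row. A minimal sketch, assuming synthetic 2-D data (the data shape and seed are illustrative, not from the original):

import numpy as np

# Toy data: most of the variance lies along the first axis.
rng = np.random.default_rng(0)
data = rng.normal(size=(100, 2)) @ np.array([[3.0, 0.0], [0.0, 0.5]])

# Center the data, then eigendecompose the sample covariance matrix.
centered = data - data.mean(axis=0)
cov = centered.T @ centered / (len(centered) - 1)
eigvals, eigvecs = np.linalg.eigh(cov)  # eigh sorts eigenvalues ascending
top_component = eigvecs[:, -1]

# Projecting onto the top component is one dot product per row;
# the variance of the result equals the largest eigenvalue.
projected = centered @ top_component

The same small, testable shape as the snippet above: every step is a NumPy one-liner you can assert against (for example, that the projected variance matches the top eigenvalue).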