Cluster Analysis Quiz 4

Test your knowledge with this Cluster Analysis Quiz featuring MCQs on k-means, k-medoids, k-means++, and k-median algorithms, along with key concepts like Manhattan distance, cosine similarity, CF tree split, and multi-class classification. Perfect for machine learning enthusiasts and data science learners to assess their understanding of unsupervised clustering techniques. Take the Cluster Analysis Quiz now and sharpen your skills!

Online Unsupervised Machine Learning Cluster Analysis Quiz with Answers

1. Considering the k-means algorithm, after the current iteration, we have three centroids (0, 1), (2, 1), and (-1, 2). Will points (0.5, 0.5) and (-0.5, 0) be assigned to the same cluster in the next iteration?

 
 

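One way to check is to compute the squared Euclidean distance from each point to every centroid, as the k-means assignment step does. A minimal Python sketch:

```python
# k-means assignment step: each point goes to its nearest centroid
# (squared Euclidean distance, same ordering as Euclidean distance).
centroids = [(0, 1), (2, 1), (-1, 2)]
points = [(0.5, 0.5), (-0.5, 0)]

def nearest(p, cs):
    """Index of the centroid closest to point p."""
    return min(range(len(cs)),
               key=lambda i: (p[0] - cs[i][0]) ** 2 + (p[1] - cs[i][1]) ** 2)

labels = [nearest(p, centroids) for p in points]
print(labels)  # [0, 0] -- both points are closest to centroid (0, 1)
```

Both points end up in the same cluster, since (0, 1) is the nearest centroid to each.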
2. Which of the following statements, if any, is FALSE?

 
 
 
 

3. When dealing with multi-class classification problems, which loss function should be used?

 
 
 
 
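For context, the loss most commonly paired with multi-class classification is categorical cross-entropy: the negative log-probability assigned to the true class. A minimal sketch of the computation:

```python
import math

def cross_entropy(probs, true_class):
    """Categorical cross-entropy: negative log of the predicted
    probability for the true class."""
    return -math.log(probs[true_class])

# A confident correct prediction incurs low loss; a wrong one, high loss.
low = cross_entropy([0.7, 0.2, 0.1], 0)   # ~0.357
high = cross_entropy([0.1, 0.2, 0.7], 0)  # ~2.303
print(low, high)
```

The example probability vectors are illustrative, not part of the original question.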

4. What are some common considerations and requirements for cluster analysis?

 
 
 
 

5. Given the two-dimensional points (0, 3) and (4, 0), what is the Manhattan distance between those two points?

 
 
 
 
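The Manhattan (L1) distance is the sum of absolute coordinate differences, which is easy to verify directly:

```python
def manhattan(p, q):
    """Manhattan (L1) distance: sum of absolute coordinate differences."""
    return sum(abs(a - b) for a, b in zip(p, q))

print(manhattan((0, 3), (4, 0)))  # |0 - 4| + |3 - 0| = 7
```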

6. In the k-medoids algorithm, after computing the new center for each cluster, is the center always guaranteed to be one of the data points in that cluster?

 
 

7. The k-means++ algorithm is designed to give k-means a better initialization by taking the point farthest from the currently selected centroids. Suppose $k = 2$, and we have selected the first centroid as (0, 0). Among the following points (these are all the remaining points), which one should we take as the second centroid?

 
 
 
 
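The original answer options are not shown here, so the sketch below uses hypothetical candidate points to illustrate the farthest-point selection rule the question describes:

```python
# Farthest-point heuristic for picking the next centroid, per the
# question's simplified description of k-means++.
first = (0, 0)
candidates = [(1, 0), (0, 2), (3, 4)]  # hypothetical, not the quiz's actual options

def sq_dist(p, q):
    return (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2

second = max(candidates, key=lambda p: sq_dist(p, first))
print(second)  # (3, 4), the candidate farthest from (0, 0)
```

(The full k-means++ algorithm actually samples the next centroid with probability proportional to squared distance; the deterministic farthest-point rule is the simplification the question uses.)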

8. If you had to choose between clustering and supervised learning for the following applications, for which would you choose clustering over supervised learning?

 
 
 
 

9. Considering the k-means algorithm, if points (0, 3), (2, 1), and (-2, 2) are the only points that are assigned to the first cluster now, what is the new centroid for this cluster?

 
 
 
 
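k-means recomputes each centroid as the component-wise mean of the points assigned to the cluster, which can be checked in a few lines:

```python
points = [(0, 3), (2, 1), (-2, 2)]

# New centroid = component-wise mean of the cluster's points.
centroid = tuple(sum(c) / len(points) for c in zip(*points))
print(centroid)  # (0.0, 2.0)
```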

10. For k-means, will different initializations always lead to different clustering results?

 
 

11. Suppose $X$ is a random variable with $P(X = -1) = 0.5$ and $P(X = 1) = 0.5$. In addition, we have another random variable $Y=X*X$. What is the covariance between $X$ and $Y$?

 
 
 
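This one can be worked directly from the definition $\mathrm{Cov}(X, Y) = E[XY] - E[X]E[Y]$, enumerating the two equally likely outcomes:

```python
# X is -1 or +1 with probability 0.5 each; Y = X*X is always 1.
outcomes = [(-1, 0.5), (1, 0.5)]

e_x  = sum(x * p for x, p in outcomes)              # E[X]  = 0.0
e_y  = sum((x * x) * p for x, p in outcomes)        # E[Y]  = 1.0
e_xy = sum(x * (x * x) * p for x, p in outcomes)    # E[XY] = E[X^3] = 0.0

cov = e_xy - e_x * e_y
print(cov)  # 0.0 -- uncorrelated, even though Y is a deterministic function of X
```

This is the classic example showing that zero covariance does not imply independence.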

12. Is k-means guaranteed to find k clusters that lead to the global minimum of the SSE?

 
 

13. Which of the following statements is true?

 
 
 
 

14. When will a leaf entry in the CF tree split?

 
 

15. Given three vectors $A$, $B$, and $C$, suppose the cosine similarity between $A$ and $B$ is $\cos(A, B) = 1.0$, and the similarity between $A$ and $C$ is $\cos(A, C) = -1.0$. Can we determine the cosine similarity between $B$ and $C$?

 
 
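Cosine similarity of 1.0 means $B$ points in the same direction as $A$, and -1.0 means $C$ points in the opposite direction, so $B$ and $C$ must be opposed. A check with example vectors chosen to satisfy those constraints:

```python
import math

def cosine(u, v):
    """Cosine similarity: dot product divided by the product of norms."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

A = (1.0, 2.0)
B = (3.0, 6.0)    # same direction as A     -> cos(A, B) = 1.0
C = (-2.0, -4.0)  # opposite direction to A -> cos(A, C) = -1.0
print(cosine(B, C))  # -1.0: B and C necessarily point in opposite directions
```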

16. Which of the following statements is true?

 
 
 
 

17. Which of the following statements is true?

 
 

18. Is it possible that the SSE strictly increases after we recompute new centers in the k-means algorithm? Why?

 
 
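Recall that the mean of a cluster is exactly the point that minimizes that cluster's SSE, so recomputing centers can never increase the total SSE. A quick numerical check on a small assumed cluster and an arbitrary previous center:

```python
points = [(0, 3), (2, 1), (-2, 2)]  # assumed sample cluster

def sse(center, pts):
    """Sum of squared Euclidean distances from pts to center."""
    return sum((x - center[0]) ** 2 + (y - center[1]) ** 2 for x, y in pts)

old_center = (1.0, 1.0)  # arbitrary previous center, for illustration
new_center = tuple(sum(c) / len(points) for c in zip(*points))  # the mean

print(sse(new_center, points) <= sse(old_center, points))  # True: the mean minimizes SSE
```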

19. Considering the k-median algorithm, if points (-1, 3), (-3, 1), and (-2, -1) are the only points that are assigned to the first cluster now, what is the new center for this cluster?

 
 
 
 
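Unlike k-means, k-median recomputes each center as the component-wise median of the cluster's points:

```python
import statistics

points = [(-1, 3), (-3, 1), (-2, -1)]

# New center = component-wise median of the cluster's points.
center = tuple(statistics.median(c) for c in zip(*points))
print(center)  # (-2, 1)
```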

20. Which of the following statements about k-medoids, k-median, and k-modes algorithms is correct?

 
 
 
 
