Addressing network security in the information age is essential so that data exchanged or stored in a network can be protected from threats. Numerous studies have shown that machine learning-based Intrusion Detection Systems (IDS) can be used to overcome the accuracy problem. Dimensionality reduction as a way to optimize the detection process has also been a focus of research. However, the results of previous research still leave room for improvement. In this paper, we propose a dimensionality reduction method that optimizes the intrusion detection process by limiting cluster sizes and using sub-medoids to form new features. The results of this research show that the proposed system outperforms existing approaches, as reflected in increased sensitivity and specificity values.
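To illustrate the general idea of medoid-based feature construction with a cluster-size limit, the sketch below is a minimal, hypothetical example (it is not the paper's actual algorithm; the partitioning into fixed-size contiguous chunks, the function names, and the parameter `max_cluster_size` are all assumptions made for illustration). Each size-capped cluster contributes one medoid, and the distances from every sample to the medoids become a new, lower-dimensional feature vector.

```python
import math

def medoid(points):
    # The medoid is the cluster member minimising total distance to the others.
    return min(points, key=lambda p: sum(math.dist(p, q) for q in points))

def medoid_features(data, max_cluster_size):
    # Partition samples into clusters no larger than max_cluster_size
    # (contiguous chunks here, purely for illustration), take each
    # cluster's medoid, and use the distance from every sample to every
    # medoid as its new, lower-dimensional feature vector.
    medoids = [medoid(data[i:i + max_cluster_size])
               for i in range(0, len(data), max_cluster_size)]
    return [[math.dist(x, m) for m in medoids] for x in data]

# Toy data: 60 samples with 3 raw features each.
data = [(float(i % 7), float(i % 11), float(i % 13)) for i in range(60)]
features = medoid_features(data, max_cluster_size=20)
print(len(features), len(features[0]))  # 60 samples, 3 medoid-distance features
```

In practice the clusters would come from an actual clustering step rather than contiguous chunks, but the cap on cluster size bounds both the number of derived features and the cost of the pairwise-distance computation inside each cluster.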