As networks grow more complex with the increasing number of devices and ubiquitous connectivity, the trend is shifting from centralized implementations toward decentralization. Similarly, strategies to secure networks are increasingly leaning toward decentralization for its potential to enhance security in future networks with the help of Machine Learning (ML) techniques. In this regard, Distributed Machine Learning (DML) techniques, such as Federated Learning (FL) and Split Learning (SL), are at the forefront of this shift, offering collaborative learning capabilities across network nodes while preserving data privacy. However, ML requires vast amounts of dedicated computing, memory, and bandwidth, and consequently energy. Moreover, the resource consumption of ML techniques used for network security has mostly been overlooked, which presents a glaring challenge for future networks in terms of overall resource utilization. This research emphasizes the importance of understanding the resource consumption patterns of two prominent DML techniques, i.e., FL and SL, by analyzing the consumption of critical resources when they are deployed for network security. Furthermore, it draws important insights from a practical comparative analysis of FL and SL in terms of resource consumption, discusses their scope for future network security, such as in 6G, and aims to stimulate further research in this area.