When variables or object instances are declared globally in a class, memory is allocated for them for as long as they remain reachable; the heavier the processing (think Big O analysis), the more memory is needed to support the function's behavior.
It is therefore worth taking time to evaluate how claimed memory is managed and collected in the application. Some programming languages, such as C#, provide automatic garbage collection (GC), so developers normally don't have to free memory themselves. Even so, GC should be a topic to consider during development: understanding when and how claimed memory is returned is very important when building responsive apps.
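The same automatic collection exists in Java, which can be used to sketch the idea. The snippet below (a minimal illustration, with hypothetical names like `GcDemo` and `waitForCollection`) drops the last strong reference to an object and then watches, through a `WeakReference`, for the garbage collector to reclaim it. Note that `System.gc()` is only a hint, so the check polls rather than assuming immediate collection.

```java
import java.lang.ref.WeakReference;

// Sketch: an object becomes eligible for collection once the last strong
// reference to it is dropped. GC timing is non-deterministic, so we poll.
public class GcDemo {
    // Returns true if the referent was collected within `attempts` GC hints.
    static boolean waitForCollection(WeakReference<?> ref, int attempts) {
        for (int i = 0; i < attempts && ref.get() != null; i++) {
            System.gc(); // a hint, not a command -- hence the retry loop
            try { Thread.sleep(10); } catch (InterruptedException e) { }
        }
        return ref.get() == null;
    }

    public static void main(String[] args) {
        Object payload = new byte[1024 * 1024];       // strongly reachable
        WeakReference<Object> ref = new WeakReference<>(payload);
        payload = null;                               // drop the strong reference
        System.out.println("collected: " + waitForCollection(ref, 50));
    }
}
```

On a typical JVM this reports `collected: true` shortly after the strong reference is nulled out, which is exactly the behavior the developer is relying on without having to write any cleanup code.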
Declaring variables in global scope is generally considered bad practice: globals stay reachable for the life of the application, so the runtime must keep tracking them across sessions, which adds garbage-collection pressure and can hurt performance. Many developers therefore declare variables inside functions; once the function returns its value, the local variables go out of scope, the objects they referenced become unreachable, and the memory is reclaimed by the garbage collector (not the compiler).
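The lifetime difference can be made concrete in Java (a sketch with hypothetical names such as `ScopeDemo` and `allocateLocally`): a value held by a static field, the closest analogue of a "global", stays reachable indefinitely, while a value referenced only by a method-local variable becomes collectible as soon as the method returns.

```java
import java.lang.ref.WeakReference;

// Sketch: a static ("global") field keeps its object alive across GC cycles,
// while a method-local allocation is reclaimable once the method returns.
public class ScopeDemo {
    static Object globalCache = new byte[1024]; // reachable while the class is loaded

    // The local variable dies with the stack frame; only a weak ref survives.
    static WeakReference<Object> allocateLocally() {
        Object local = new byte[1024];
        return new WeakReference<>(local);
    }

    // Polls the collector, since System.gc() is only a hint.
    static boolean collected(WeakReference<?> ref) {
        for (int i = 0; i < 50 && ref.get() != null; i++) {
            System.gc();
            try { Thread.sleep(10); } catch (InterruptedException e) { }
        }
        return ref.get() == null;
    }

    public static void main(String[] args) {
        System.out.println("local collected:  " + collected(allocateLocally()));
        System.out.println("global collected: " + collected(new WeakReference<>(globalCache)));
    }
}
```

On a typical JVM the local allocation is reported as collected while the static one is not, which is the garbage-collection pressure the paragraph above warns about: every global is one more object the collector must trace on every cycle without ever being able to free it.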
Declaring variables inside the function is therefore the usual advice, except in cases where state genuinely must be shared, such as a global variable that other classes need to access.
Algorithm design becomes important largely because of loops: a loop that allocates new objects, or holds on to references it no longer needs, on every iteration is a common source of memory growth and leaks, so a developer should take time to plan the least costly approach. A memory-hogging application can cost a company sales (losing customers) as well as hardware upgrades. So whenever you develop an application, keep loops lean: pick the simplest construct that fits the job (for, while, foreach, or do-while), minimize the work and allocation done per iteration, and reach for a better algorithm before reaching for a nested loop.
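A classic illustration of per-iteration allocation cost, sketched here in Java with hypothetical names (`joinNaive`, `joinBuffered`): concatenating strings inside a loop creates a new string on every pass, copying everything accumulated so far, while reusing a single `StringBuilder` produces the same result with far fewer allocations. The loop keyword is the same in both; what differs is what each iteration allocates.

```java
// Sketch: the loop construct matters less than what each iteration allocates.
public class LoopAlloc {
    // Allocates a fresh String every iteration; each += copies all prior text,
    // so total copying grows quadratically with n.
    static String joinNaive(int n) {
        String out = "";
        for (int i = 0; i < n; i++) {
            out += i + ",";
        }
        return out;
    }

    // One growing buffer, one final String: same output, linear allocation.
    static String joinBuffered(int n) {
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < n; i++) {
            sb.append(i).append(',');
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.println(joinBuffered(5)); // prints 0,1,2,3,4,
    }
}
```

Both methods return identical strings; the buffered version simply asks less of the garbage collector, which is the kind of "less costly plan" the paragraph above is about.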