I guess there is a technical question and a big-picture question here. Big picture: you could just allocate everything you might ever need as static variables, which simply exist as long as the program does. And your note about gigabytes of RAM suggests that "allocate everything at compile time" could work today for programs where it would not have worked so well 30 years ago. But there are still good technical reasons for malloc(). At the big-picture level, the reasons are that (1) programs can handle a wider range of circumstances, and (2) operating systems almost certainly work better when lots of programs are kept running or available. It used to be that you ran one program at a time, perhaps a sequence of them in a "batch." Today, workstations let you copy and paste and keep lots of programs, drivers, and background routines running at once, and these often cooperate with each other in novel ways that perhaps even the programmers themselves hadn't considered. So it's a good idea on that level.
The technical side is more of a "lifetime" thing. In C, (1) a variable may exist only for the duration of a function call (local, or automatic, lifetime); (2) it may exist for as long as the program is running (static lifetime -- defined either at file scope or inside a function with the 'static' keyword); or (3) it may exist longer than the function that created it, but shorter than the whole program -- these are the 'heap' variables, and they are allocated with malloc(). A function can malloc() some memory and return a pointer to it, letting some other routine decide when to destroy it later on. There is value in that even with gigabytes of memory.
Microsoft's Word program creates a document instance when you create a new document. It can't do that with a local function variable -- as soon as the function exits, it's gone. It could do it with a large, mostly unused static-lifetime array sized for the maximum number of documents you are allowed to open. But then the program's memory footprint would be huge, making its management by the operating system a little more difficult, and most of the time far, far less would be needed anyway. So why pressure the memory system (or the page file on disk)?
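That trade-off can be sketched in a few lines. The `Document` struct and function names here are hypothetical stand-ins, not Word's actual internals:

```c
#include <stdlib.h>
#include <string.h>

/* Hypothetical per-document record -- a stand-in for whatever a word
   processor actually keeps for each open document. */
typedef struct {
    char   title[64];
    char  *text;       /* the body is heap-allocated too, grown as needed */
    size_t length;
} Document;

/* The static-array alternative: pay for the maximum up front, always:
   static Document all_docs[1024];    -- huge footprint, even for one document */

/* The malloc() alternative: pay only for documents actually open. */
Document *document_new(const char *title)
{
    Document *d = malloc(sizeof *d);
    if (d == NULL)
        return NULL;
    strncpy(d->title, title, sizeof d->title - 1);
    d->title[sizeof d->title - 1] = '\0';
    d->text = NULL;
    d->length = 0;
    return d;          /* lives until some other code calls document_free() */
}

void document_free(Document *d)
{
    if (d != NULL) {
        free(d->text);
        free(d);
    }
}
```

Each open document costs one allocation, made when the user actually opens it, and the memory goes back to the system the moment the document is closed.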
I'm not sure what kind of short example program would settle this for you by itself. Anything small enough to post is necessarily idealized, and it's the lifetime reasoning above, not the code, that makes the point.