Does the order of directives in a very basic robots.txt change a directive's scope?

I had a very simple robots.txt file set up for a site I maintain. After a spike of traffic that the ISP put down to crawlers, they suggested I add a crawl-delay directive, which is fair enough. So I ended up with this file:

User-agent: *
Disallow: /a-page-i-wanted-to-ignore
Crawl-delay: 1

I still receive spikes in traffic that are causing downtime. The ISP told me that this directive has only set the crawl delay for the page /a-page-i-wanted-to-ignore.

I wanted to check: is that correct? Does placing a directive like Crawl-delay after a Disallow line make it specific to that Disallow rule?
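As a sanity check, I tried feeding the exact file to Python's built-in urllib.robotparser to see how a standard parser groups the directives (this is just one parser's interpretation, and individual crawlers may differ):

```python
from urllib.robotparser import RobotFileParser

# The exact robots.txt content from the question
robots_txt = """\
User-agent: *
Disallow: /a-page-i-wanted-to-ignore
Crawl-delay: 1
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# What delay does this parser report for the whole "*" group?
print("crawl delay for *:", rp.crawl_delay("*"))

# Disallow still applies to the listed page, but not to others
print("can fetch disallowed page:", rp.can_fetch("*", "/a-page-i-wanted-to-ignore"))
print("can fetch another page:", rp.can_fetch("*", "/some-other-page"))
```

This parser attaches Crawl-delay to the User-agent: * group as a whole rather than to the preceding Disallow line, which is what I expected — but I don't know whether the crawlers hitting my site read it the same way.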