The amount of disk space really depends on the chapter data. The latest core collection in English has a size of 10 GB, so the index structures might well exceed 200 GB. At least 16-32 GB of RAM won't hurt for caches. I currently do not have any precise values here.
DBpedia Lookup only requires around 4-5 GB of RAM when running and 5 GB of disk space for the index structures.
DBpedia Spotlight requires around 50 GB of RAM and around 30 GB of disk space.
The whole stack profits from multi-threading, so a lot of CPU cores and threads are helpful here.
Maybe someone can give more precise values, but I would suggest:

- more than 128 GB of RAM
- any modern CPU with 8 cores or more
- more than 500 GB of disk space
There is no heavy rendering done in the stack, so a strong GPU is not required. If you intend to build your own lookup index structure (which might be required for your chapter), you can either build the index on disk (slow) or in memory (a lot faster). The in-memory build can however require up to 200 GB of RAM, so only choose this option if you have that much available.
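As a minimal sketch of that decision, you could check the machine's available memory before picking a build mode. The 200 GB threshold is the estimate from above, and the echoed build modes are placeholders, not actual DBpedia Lookup CLI flags:

```shell
#!/bin/sh
# Decide between on-disk and in-memory index builds based on free RAM (Linux).
# Threshold: ~200 GB, per the estimate above. Expressed in kB to match /proc/meminfo.
required_kb=$((200 * 1024 * 1024))

# MemAvailable is the kernel's estimate of memory usable without swapping.
available_kb=$(awk '/MemAvailable/ {print $2}' /proc/meminfo)

if [ "$available_kb" -ge "$required_kb" ]; then
    echo "enough RAM: build the index in memory"
else
    echo "not enough RAM: build the index on disk"
fi
```

On a machine with less than 200 GB free, this falls back to the slower on-disk build instead of risking the build process being killed by the OOM killer.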