Currently you either have to import one URL at a time (single batch crawl) or use a spreadsheet to import a large number of URLs. It would be helpful to be able to add 5, 10, 15 or even 20 URLs in a list.
For example, the single batch crawl could give you the option of adding a list of URLs instead of just one. Alternatively, keep the single batch crawl as it is and add a separate Multi URL crawl option for as many URLs as the system will cope with, followed by the bulk batch crawl that relies on the spreadsheet.