import json
import pandas as pd
import requests

test = pd.read_csv('/Users/jappanjeetsingh/Downloads/Test/ID.csv').values

jokes = []
for id in test:
    # fetch one joke per ID from the ICNDb API
    response = requests.get("http://api.icndb.com/jokes/{}".format(int(id)))
    joke = json.loads(response.content)['value']['joke']
    jokes.append(joke)
Every time I run this piece of code, it gives no output and just keeps running.
Hey @LP18Aug0041,
Your code is correct and works as required. Since you are using an online API, retrieving the data depends on your internet speed. If your connection is slow at any point, the jokes will be retrieved slowly too, and the loop will take time. That is the only reason your code keeps on running.
I hope this resolves your doubt.
Thank you and Happy Learning.
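If you want to confirm that the loop is actually progressing rather than hanging, a small tweak like the sketch below can help. This is only an illustrative variant of the code above; the timeout value and the progress print are additions of mine, not part of the original code.

import json
import pandas as pd
import requests

test = pd.read_csv('/Users/jappanjeetsingh/Downloads/Test/ID.csv').values

jokes = []
for i, id in enumerate(test):
    # timeout makes a stuck request raise an error instead of blocking forever
    response = requests.get("http://api.icndb.com/jokes/{}".format(int(id)), timeout=10)
    jokes.append(json.loads(response.content)['value']['joke'])
    if i % 50 == 0:
        # periodic progress message so you can see the loop is still moving
        print("fetched {} jokes so far".format(i + 1))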
But Prashant, I have a very fast internet connection, around 40 Mbps.
If that really is the only reason, how can I make my submission?
I have run your code on Google Colab.
I also ran a quick check of the speed I get on Google Colab; it reported 1.47 Gbps, and we both know it isn't really that, but whatever the actual speed is, it must be greater than 400 Mbps.
Even at that speed it took me about 3 minutes to retrieve the data, and this kind of scraping on Google Colab is usually faster than on a local system.
So I think the network round-trips are the only reason your code is taking so much time.
To do it much faster,
just fetch all the data from http://api.icndb.com/jokes (without using any particular id); it will return all the jokes in one response.
You then only have to format and preprocess the result the way you need it, as in the sketch below. It will be a lot faster than the per-id approach.
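A minimal sketch of that approach, assuming the bulk /jokes endpoint wraps the list of jokes under the same 'value' key as the single-joke endpoint (the exact field names here are an assumption based on the per-id response shown earlier):

import json
import pandas as pd
import requests

# one request for the full joke list instead of one request per ID
response = requests.get("http://api.icndb.com/jokes")
all_jokes = json.loads(response.content)['value']  # assumed: list of {'id': ..., 'joke': ...}

# index the jokes by id so we can pick out only the IDs listed in the CSV
by_id = {item['id']: item['joke'] for item in all_jokes}

test = pd.read_csv('/Users/jappanjeetsingh/Downloads/Test/ID.csv').values
jokes = [by_id[int(id)] for id in test]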
I hope this helps.
Thank you and Happy Learning.
Ok Prashant, I'll try to scrape it using bs4. Thank you so much for the help!
That code ran. It took around 3 minutes. Thanks for the help!!