WebDL: ABC iView and SBS Downloader
WebDL is a collection of Web TV downloader scripts with a consistent user interface. I’ve previously released these separately, but a while ago I refactored them to share common code and packaged them into a single utility. You can use it interactively, or from a cronjob to download any shows matching a glob. Currently supported are ABC iView and SBS OnDemand. I’ll probably add more in the future.
Update 2015-05-24: Please see the Bitbucket project for up to date docs!
Update 2015-05-24: Fixed SBS and Channel 9. Livestreamer is now a required dependency.
Update 2014-07-22: Added notes on versions to dependencies.
Update 2014-02-15: Please see https://bitbucket.org/delx/webdl for bug reports or to post patches.
Update 2013-03-26: The latest version of autograbber.py now accepts a file with a list of patterns instead of taking them from the command line.
Dependencies
- Livestreamer
- python (2.7, not python 3)
- python-lxml
- rtmpdump a1900c3e15
- ffmpeg / libav
The versions listed above are the ones I have had success with. In particular, note that rtmpdump always reports v2.4 even though many binaries with different bugs and features have been built under that version number. If something doesn’t work, try compiling a newer ffmpeg/avconv or rtmpdump to see if that fixes the problem.
Interactive Usage
You can run WebDL interactively to browse categories and episode lists and download TV episodes.
$ ./grabber.py
 1) ABC iView
 2) SBS
 0) Back
Choose> 1
 1) ABC 4 Kids
 2) Arts & Culture
 3) Comedy
 4) Documentary
<snipped>
Choose> 4
 1) ABC Open Series 2012
 2) Art Of Germany
 3) Baby Beauty Queens
 4) Catalyst Series 13
<snipped>
Choose> 4
 1) Catalyst Series 13 Episode 15
 2) Catalyst Series 13 Episode 16
 0) Back
Choose> 1
RTMPDump v2.3
(c) 2010 Andrej Stepanchuk, Howard Chu, The Flvstreamer Team; license: GPL
Connecting ...
INFO: Connected...
Starting download at: 0.000 kB
The parts after each Choose> prompt are what you type. Note that you can go back from any screen by typing “0”. At the list of episodes you can download a single episode by typing one number, or multiple episodes by typing several numbers separated by spaces.
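The multi-select behaviour at the prompt can be sketched in a few lines. This is a hypothetical helper for illustration only, not WebDL’s actual parser:

```python
def parse_choices(line, max_index):
    """Interpret input at a Choose> prompt: "0" alone goes back,
    otherwise space-separated numbers select one or more items.
    Illustrative sketch only, not WebDL's own code."""
    tokens = line.split()
    if tokens == ["0"]:
        return "back"
    picks = [int(t) for t in tokens]
    if any(p < 1 or p > max_index for p in picks):
        raise ValueError("choice out of range")
    return picks

print(parse_choices("1", 16))      # select a single episode
print(parse_choices("1 3 7", 16))  # select several episodes at once
```

The real grabber also rejects non-numeric input and re-prompts; this sketch just raises.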
Cron Scripted Usage
I have a crontab entry which looks something like this, so it runs daily:
# m h  dom mon dow  command
0 1 * * * ./autograbber.py /path/to/video-dir/ /path/to/patterns.txt
The patterns.txt file should contain shell-style globs, something like:
ABC iView/*/QI*/*
SBS/Programs/Documentary/*/*
The above will download all episodes of QI from ABC as well as every SBS documentary. Whenever an episode is downloaded it is recorded into downloaded_auto.txt. Even if you move the files somewhere else they will not be redownloaded.
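The matching is plain shell-glob logic over slash-separated episode paths. A minimal sketch of the idea (my own illustration, not WebDL’s code; the function name and data shapes are assumptions):

```python
import fnmatch

def wanted(episode_path, patterns, downloaded):
    """Return True if the episode matches a glob from patterns.txt
    and has not already been recorded in downloaded_auto.txt.
    Hypothetical helper for illustration only."""
    if episode_path in downloaded:
        return False  # never re-download, even if the file was moved
    return any(fnmatch.fnmatchcase(episode_path, p) for p in patterns)

patterns = ["ABC iView/*/QI*/*", "SBS/Programs/Documentary/*/*"]
downloaded = {"ABC iView/Comedy/QI Series 10/QI Episode 1"}

print(wanted("ABC iView/Comedy/QI Series 10/QI Episode 2", patterns, downloaded))  # True
print(wanted("ABC iView/Comedy/QI Series 10/QI Episode 1", patterns, downloaded))  # False
```

Note that `fnmatch` lets `*` cross `/` boundaries, which is why a pattern like `ABC iView/*/QI*/*` matches any category containing a QI series.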
Just downloaded this to my mipsel enigma2 stb… it all works except p7… very impressed.
Some of the downloads are only fair quality… but then I don’t want to max out my hdd. Would be good to have some higher quality options if that was possible.
Excellent work and good luck with the pesky p7:)
I also had a few hiccups with iview (see Aidan) but I put it down to their site.
SBS is broken somehow. I end up with an mp4 of plausible size; VLC recognises its duration but won’t play it (no error, just doesn’t play) and won’t show any media or codec data. Windows Media player won’t play it either.
BTW the file is rather larger than I would have expected compared to the ABC ones which I’m used to. Is this an SBS compression issue, or are you hard-coding picking up SBS HD?
This is with the latest and greatest code.
Everything works except iview which gives the error:
RTMPDump v2.5 GIT-2012-03-31 (Handshake 10 support by Xeebo)
(c) 2010 Andrej Stepanchuk, Howard Chu, The Flvstreamer Team; license: GPL
rtmpdump: symbol lookup error: rtmpdump: undefined symbol: InjectFlashPlayerRtmpe10HandshakeCode
rtmpdump exited with error code: 127
I used to use python-iview, and it has the same error. Is there a better version of RTMPDump?
I rolled back to RTMPDump 2.4 and now iview downloads! SBS continues to work very well. Thank you for this project!
Nice on 10.6.8 – was using MacPorts’ rtmpdump @ 2.3, but nine/ten work much better under rtmpdump 2.4 (not sure why MacPorts isn’t providing 2.4 – it’s a pretty easy Makefile edit?)
Thanks.
This is fantastic! Just what I’ve been searching for, a non-Windows dependent solution that actually works AND is consistently updated. Can’t thank you enough :D
I’ve only tried downloading a few shows from SBS and ABC so far and all seems well. Only question is regarding the HD iView content people are talking about above – is it just certain shows, and is it working at the moment? Doesn’t seem to be for me but maybe it’s just the stuff I’ve been grabbing. Any tips?
Thanks again!
From memory iView now also streams content using Adobe HDS. I haven’t spent much time working out how to grab this yet. It seems like somebody has already done a lot of the hard work here: https://github.com/K-S-V/Scripts/blob/master/AdobeHDS.php
Cool.. Wish my programming skills went any further than ‘hello world’ :P
Think you will be able to incorporate this at some stage? I’d be more than happy to throw some beer money your way via PayPal or something as a thankyou. Great work, regardless!
If it helps at all, just found a few things that might be useful:
http://stream-recorder.com/forum/adobe-hds-downloader-t14823p13.html
https://github.com/k3c/Scripts/blob/master/AdobeHDS.py
Has something changed for SBS recently, now seems broken?
I did upgrade recently, maybe it’s my end?
Traceback (most recent call last):
  File "./grabber.py", line 56, in <module>
    main()
  File "./grabber.py", line 36, in main
    for n in node.get_children():
  File "/home/johnb/scripts/flvgrabber/v7/common.py", line 43, in get_children
    self.fill_children()
  File "/home/johnb/scripts/flvgrabber/v7/sbs.py", line 77, in fill_children
    menu = grab_json(VIDEO_MENU, 3600, skip_assignment=True)
  File "/home/johnb/scripts/flvgrabber/v7/common.py", line 138, in grab_json
    doc = json.loads(text[pos+1:])
  File "/usr/lib/python2.7/json/__init__.py", line 326, in loads
    return _default_decoder.decode(s)
  File "/usr/lib/python2.7/json/decoder.py", line 366, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "/usr/lib/python2.7/json/decoder.py", line 382, in raw_decode
    obj, end = self.scan_once(s, idx)
ValueError: Expecting : delimiter: line 1 column 128733 (char 128733)
I have the same problem with SBS.
Same here for SBS and I haven’t upgraded anything. :(
I get the same thing for SBS. I don’t really know what’s going on, but there seems to be some corruption in the downloaded json file fetched from the SBS website. I’ve been able to work around the fault by replacing the cached file with one that I manually download using wget, like so:
$ U=http://www.sbs.com.au/ondemand/js/video-menu
$ FN=~/.cache/webdl/$(printf "$U" | md5sum | { read A B; echo $A; })
$ wget -O $FN $U
This grabs the page using wget and overwrites the cached copy. If you run grabber.py within the next hour, it will use that cached version. After an hour the cache expires and a fresh copy is downloaded, which would probably fail again, so the commands above would need to be re-run.
I can’t work out why the python download gets this corruption but wget does not. I’m not a python programmer, so someone else might be able to provide some insight. The corruption always seems to be within the last hundred or so bytes of the end.
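For what it’s worth, this kind of corruption is easy to detect programmatically. Here is a small sketch (my own illustration, not part of WebDL) that flags cached bytes containing control characters other than ordinary whitespace, which is the signature of the fault described here:

```python
import re

# Control characters other than tab (\x09), LF (\x0a) and CR (\x0d).
CONTROL_CHARS = re.compile(br"[\x00-\x08\x0b\x0c\x0e-\x1f]")

def looks_corrupt(data):
    """Return True if the cached bytes contain unexpected control
    characters. Illustrative check only; WebDL itself does not do this."""
    return CONTROL_CHARS.search(data) is not None

# Typical use would be: looks_corrupt(open(cache_path, "rb").read())
print(looks_corrupt(b'{"name": "World Cup 2010"}'))      # False
print(looks_corrupt(b'{"na\x01me": "World Cup 2010"}'))  # True
```

If the check fires, deleting the cache file and re-running the grabber is the simplest recovery.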
OK, so I’ve spliced a call to wget into the program with the following patch:
It’s a really hacky work-around, but seems to be working OK for me.
Brilliant Jez – confirmed working here also.
Note for OS X users – you may need to add this to you ~/.bash_profile or similar to get the above to work:
alias md5='md5 -r'
alias md5sum='md5 -r'
Thanks for the temporary fix, I can’t live without If You Are The One ;)
Not so fast! That patch forces the use of wget for all web fetching, which breaks anything that requires cookies or referrers. So this patch does the wget only for the SBS URL that’s causing the problem.
With regard to the SBS issue, I only managed to reproduce it once. Right now running the following commands reliably gives the same hash.
$ python -c 'import hashlib, urllib; print hashlib.md5(urllib.urlopen("http://www.sbs.com.au/ondemand/js/video-menu").read()).hexdigest()'
$ curl -s http://www.sbs.com.au/ondemand/js/video-menu | md5sum -
Yeah, me too. I cannot make it fail anymore. Very weird. I’m pulling out my work-around and I’ll see how it goes over the next few days.
FYI, this is the corruption I saw. Below is a diff between two copies of the downloaded file, but with newlines inserted after every closing brace.
$ diff v1a v2a
302,304c302
< ,{"id":"1515","pid":"1285","name":"World Cup 2010","thumbnail":"http:\/\/videocdn.sbs.com.au\/u\/video\/p\/thumbnails\/WC2010_EXT_Socceroos_review_Par_242_279237.jpg","thumbnailLarge":"http:\/\/videocdn.sbs.com.au\/u\/video\/p\/slates\/WC2010_EXT_Socceroos_review_Par_242_279237.jpg","furl":"","formFilter":"0","df":"0","clickable":"1","url":"\/api\/video_feed\/f\/Bgtm9B\/sbs-section-clips?form=json&byCategories=Sport%2FFootball&byCustomValue=%7Bgrouping%7D%7BWorld+Cup+2010%7D"}
< ]}
---
> ,{"id":"1515","pid":"1285","name":"World Cup 2010","thumbnail":"http:\/\/videocdn.sbs.com.au\/u\/video\/p\/thumbnails\/WC2010_EXT_Socceroos_review_Par_242_279237.jpg","thumbnailLarge":"http:\/\/videocdn.sbs.com.au\/u\/video\/p\/slates\/WC2010_EXT_Socceroos_review_Par_242_279237.jpg","fuа▒_▒▒@▒▒_RqF▒▒▒▒▒▒▒▒'DDn3TqFtm9B\/sbs-section-clips?form=json&byCate▒▒▒_port%2FFMqFbyCustomValue=%7▒▒_8▒▒_+201X▒▒_ing Soon":{"id":"1287","pid":"1,"name":"Coming ▒▒▒_▒▒▒_▒▒▒▒▒▒▒▒▒▒_.sbs.com.au\▒▒▒_BS\/ 787\/48859203693_0919-small▒▒▒_▒ ~ye":"http▒▒▒_s.com.au▒▒▒_eo\/SBS\▒▒▒_x\/48859OqFe.jp,"ex:"comingsoon","formFilter":"0","df":"0","clickable":"1","url":"\/api\/video_feed\/f\/Bgtm9B\/sbs-video-comingsoon?form=json&byCategories=Section%2FPromos%2CChannel%2FSBS1%7CChannel%2FSBS2"}
Is this Index Error for SBS new?
Just updated, cleared cache etc.
SBS, Masters Of Sex ep 1 or 2, Robot Chicken fine.
Yeah, even weirder. Failing again, but only occasionally. :-(
@AET, I think that’s geoblocking. If I run webdl/grabber.py from Australia without any interesting routing then it works fine. I’ve pushed an update which will give a more useful error message.
Hi Delx,
It works in the browser and I am with Westnet/Iinet, so I hope not.
after update, crunch. Anything I can help with?
Happy to try things manually in the code, e.g. which lines do I change to echo/print so I can run the command by hand?
Other SBS shows working alright.
Just the Masters of Sex series.
Choose> 63
Traceback (most recent call last):
  File "./grabber.py", line 56, in <module>
    main()
  File "./grabber.py", line 49, in main
    if not n.download():
  File "/root/webdl/webdl/sbs.py", line 50, in download
    ext = urlparse.urlsplit(video_url).path.rsplit(".", 1)[1]
IndexError: list index out of range
SBS is just completely broken for me. wget trick doesn’t help, pulling the latest update didn’t help either. No weird routing is involved. I am always getting an error like this:
Traceback (most recent call last):
  File "./grabber.py", line 56, in <module>
    main()
  File "./grabber.py", line 36, in main
    for n in node.get_children():
  File "/Users/samburg/webdl/common.py", line 43, in get_children
    self.fill_children()
  File "/Users/samburg/webdl/sbs.py", line 81, in fill_children
    menu = grab_json(VIDEO_MENU, 3600, skip_assignment=True)
  File "/Users/samburg/webdl/common.py", line 138, in grab_json
    doc = json.loads(text[pos+1:])
  File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/json/__init__.py", line 326, in loads
    return _default_decoder.decode(s)
  File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/json/decoder.py", line 360, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/json/decoder.py", line 376, in raw_decode
    obj, end = self.scan_once(s, idx)
ValueError: Invalid control character at: line 1 column 128739 (char 128739)
Looks to me like pretty much the same error we started getting a few days ago but at that stage, the wget workaround fixed it for me – now it doesn’t. :(
@Sam, first, to see if this is the same problem (which I’m confident it is), check your cache file for control characters. Do this before running the grabber again, as the cache files are overwritten by the grabber after an hour.

grep -c '[[:cntrl:]]' ~/.cache/webdl/bb24a0ea23f67ae29fc167d98f4a9216

Report back what you find. This counts the number of lines with control characters; it should be zero, but becomes one when this problem arises.

Then run the grabber again and see if it works. If it’s been more than an hour, a newly downloaded file might be uncorrupted. If not, remove the cache file and try running the grabber again:

rm ~/.cache/webdl/bb24a0ea23f67ae29fc167d98f4a9216
It’s working for me as of right now.
Finally got a chance to try this. When I checked for control characters I did indeed get 1. Ran the grabber again and now all is working fine. So, good news!
Weirdest thing though is that I had tried clearing the cache previously and that still didn’t work. I guess SBS has just been patchy lately and I’ve had worse luck than some because I happened to be trying at exactly the wrong times..?
Anyway – thanks for the advice :)
@AET
I looked into the Masters of Sex on SBS problem. It seems SBS only provides this video via Adobe HDS. This is the same protocol that Channel 7 now uses exclusively. iView also uses it, though there it’s still optional.
It seems to be getting quite popular, so I’ll have to investigate writing some code for it at some point.
Yes please! If you can get HDS going, does that mean we’ll be able to fetch Ch7 shows again?
Anything we can do to help? My Python skills are modest, but if I can help, please let me know.
Ditto on the help.
Just looking into the problem: Video DownloadHelper on Firefox lights up when viewing On Demand and starts retrieving the files in 1 MB chunks, which are then reassembled into a viewable stream. It seems to run around 10-15 chunks ahead. I fancy they have busted this elsewhere.
Hi,
Has anyone managed to extract the PLUS7 Brightcove token key? I have found links to an XML feed (http://au.syndication.yahoo.com/iptv/autv-plus7-shows-samsung.xml) but I want to connect directly to the Brightcove JSON feed to get their play IDs.
I have spent some time searching through the source and can’t seem to find it. 10 and 9 were quite easy to find.
Any help Plz ?
Hi there!
This program seems to work quite well once properly set up (rtmpdump et al.), except for ABC24 / ABC News live streaming. The first time I tried this (I saw “ABC24” in the channel listing) rtmpdump fell over with some kind of authentication token error. The ABC24 listing has now disappeared from the channel view (some weeks/months later; I didn’t upgrade or change the program though! lol), and when I select “ABC News 24 Live Stream” (via the only method now available, ABC > Genre > News > News 24 Live Stream) I am greeted with the following Python crash:
Traceback (most recent call last):
  File "grabber.py", line 56, in <module>
    main()
  File "grabber.py", line 36, in main
    for n in node.get_children():
  File "/home/i336/webdl/common.py", line 43, in get_children
    self.fill_children()
  File "/home/i336/webdl/iview.py", line 41, in fill_children
    series_doc = grab_json(self.params["api"] + "series=" + self.series_id, 3600)
  File "/home/i336/webdl/common.py", line 145, in grab_json
    doc = json.load(f)
  File "/usr/lib/python2.7/json/__init__.py", line 278, in load
    **kw)
  File "/usr/lib/python2.7/json/__init__.py", line 326, in loads
    return _default_decoder.decode(s)
  File "/usr/lib/python2.7/json/decoder.py", line 366, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "/usr/lib/python2.7/json/decoder.py", line 384, in raw_decode
Not sure what’s going on here, but the live news feeds are the part I’m most interested in.
If you could possibly explain what’s going on and/or if it’s fixable, that’d be awesome.
Just thought I’d let you know. Thanks :)
-i336
@i336, sorry, the live streaming is not supported at the moment.
I see. It works using a different system perhaps? Is the authentication system different? Just curious :P
Thinking about it a bit it *is* live, but rtmpdump is designed to save stuff, which is more suited for catch-up TV viewing. FYI, I personally wouldn’t mind dumping live TV though (if rtmpdump isn’t utterly incompatible at some fundamental level) and just playing the dump manually a few seconds after starting the stream – I’d have infinite rewind :D
-i336
@i336, I’ve not looked into this in too much depth, but it seems iView just doesn’t return any data for the live stream.
Compare:
http://iview.abc.net.au/api/legacy/flash/?series=9900018 (News 24)
http://iview.abc.net.au/api/legacy/flash/?series=2365989 (QI)
Do you have a bug/ticket logging system?
@Pete, no sorry.
Anyone having trouble with SBS?
Downloading: How Small Is The Universe.f4m
Traceback (most recent call last):
  File "./grabber.py", line 56, in <module>
    main()
  File "./grabber.py", line 49, in main
    if not n.download():
  File "/root/webdl/webdl/sbs.py", line 53, in download
    return download_urllib(filename, video_url, referrer=SWF_URL)
  File "/root/webdl/webdl/common.py", line 228, in download_urllib
    src = _urlopen(url, referrer)
  File "/root/webdl/webdl/common.py", line 83, in _urlopen
    return urlopener.open(req)
  File "/usr/lib64/python2.6/urllib2.py", line 397, in open
    response = meth(req, response)
  File "/usr/lib64/python2.6/urllib2.py", line 510, in http_response
    'http', request, response, code, msg, hdrs)
  File "/usr/lib64/python2.6/urllib2.py", line 435, in error
    return self._call_chain(*args)
  File "/usr/lib64/python2.6/urllib2.py", line 369, in _call_chain
    result = func(*args)
  File "/usr/lib64/python2.6/urllib2.py", line 518, in http_error_default
    raise HTTPError(req.get_full_url(), code, msg, hdrs, fp)
urllib2.HTTPError: HTTP Error 403: Forbidden
Looks like something else has died. I just checked on two different linux machines and something has broken. The error message is …
Traceback (most recent call last):
  File "./autograbber.py", line 65, in <module>
    main(destdir, patternfile)
  File "./autograbber.py", line 55, in main
    match(download_list, node, search)
  File "./autograbber.py", line 45, in match
    match(download_list, child, pattern, count+1)
  File "./autograbber.py", line 45, in match
    match(download_list, child, pattern, count+1)
  File "./autograbber.py", line 45, in match
    match(download_list, child, pattern, count+1)
  File "./autograbber.py", line 45, in match
    match(download_list, child, pattern, count+1)
  File "./autograbber.py", line 43, in match
    for child in node.get_children():
  File "/home/ljw/iviewdl/webdl/common.py", line 43, in get_children
    self.fill_children()
  File "/home/ljw/iviewdl/webdl/iview.py", line 45, in fill_children
    vpath = episode["n"]
KeyError: 'n'
Fixed iView. SBS is starting to use Adobe HDS, which I have not written support for yet.
I was still using iViewFox 1.4 until 2 weeks ago when it stopped working. I’m hoping that you guys can get rtmpdump working again for ABC iView..
Cheers SC
Awesome! It’s working again.
Firstly, thanks for all your great work and continuing effort. I’ve been quietly grateful for some time now.
FYI, today, both ABC and SBS have broken (for me):
SBS:
  File "/usr/lib/python2.7/urllib2.py", line 527, in http_error_default
    raise HTTPError(req.get_full_url(), code, msg, hdrs, fp)
urllib2.HTTPError: HTTP Error 403: Forbidden
ABC:
  File "/usr/lib/python2.7/socket.py", line 447, in readline
    data = self._sock.recv(self._rbufsize)
socket.error: [Errno 104] Connection reset by peer
Thanx
Feature request: a key command (p?) to print out the current folder location, to make it easier to write the correct path for the patterns file when using autograbber.
Thanks
Looks like it’s broken again on ABC. (SBS seems ok though.)
Choose> 0
1) ABC iView
2) Nine
3) SBS
4) Ten
5) Yahoo Plus7
0) Back
Choose> 1
Traceback (most recent call last):
  File "D:\Documents and Settings\xxxx\My Documents\delx-webdl-833418ded83b\grabber.py", line 56, in <module>
    main()
  File "D:\Documents and Settings\xxxx\My Documents\delx-webdl-833418ded83b\grabber.py", line 36, in main
    for n in node.get_children():
  File "D:\Documents and Settings\xxxx\My Documents\delx-webdl-833418ded83b\common.py", line 43, in get_children
    self.fill_children()
  File "D:\Documents and Settings\xxxx\My Documents\delx-webdl-833418ded83b\iview.py", line 115, in fill_children
    self.load_series()
  File "D:\Documents and Settings\xxxx\My Documents\delx-webdl-833418ded83b\iview.py", line 79, in load_series
    series_list_doc = grab_json(self.params["api"] + "seriesIndex", 3600)
  File "D:\Documents and Settings\xxxx\My Documents\delx-webdl-833418ded83b\common.py", line 134, in grab_json
    f = urlopen(url, max_age)
  File "D:\Documents and Settings\xxxx\My Documents\delx-webdl-833418ded83b\common.py", line 100, in urlopen
    src = _urlopen(url)
  File "D:\Documents and Settings\xxxx\My Documents\delx-webdl-833418ded83b\common.py", line 83, in _urlopen
    return urlopener.open(req)
  File "C:\Python27\lib\urllib2.py", line 406, in open
    response = meth(req, response)
  File "C:\Python27\lib\urllib2.py", line 519, in http_response
    'http', request, response, code, msg, hdrs)
  File "C:\Python27\lib\urllib2.py", line 444, in error
    return self._call_chain(*args)
  File "C:\Python27\lib\urllib2.py", line 378, in _call_chain
    result = func(*args)
  File "C:\Python27\lib\urllib2.py", line 527, in http_error_default
    raise HTTPError(req.get_full_url(), code, msg, hdrs, fp)
urllib2.HTTPError: HTTP Error 500: Internal Server Error
(I’ve just added this as a bitbucket issue. I hope that’s ok.)
Hmm, just a glitch. Ok now (as James notes).
Not exactly an issue with WebDL itself, but I’ve not yet found anything that can play the resulting downloaded mp4 files from SBS. The ABC ones play fine with (for example) VLC, Windows Media Player and KMPlayer, but none of these can cope with the SBS ones. Any tips, anyone?
I just downloaded an SBS program: Stephen Hawkings Future Universe Ep4 – Perfect City. I was able to play it back with both VLC and Windows Media Player without problems.
Downloaded from Nine but couldn’t play back (I wonder whether the file is encrypted?)
Ten seems okay. Seven still lacks a brightcove token. ABC seems fine.
Actually, now I try it again I’m reminded that it’s the conversion step that fails:
  File "D:\Documents and Settings\Joe\My Documents\delx-webdl-833418ded83b\common.py", line 176, in convert_flv_mp4
    os.rename(orig_filename, flv_filename)
WindowsError: [Error 32] The process cannot access the file because it is being used by another process
I have an open bitbucket issue so James is trying to help.
Looks like fun!
There’s a good “how to” for OS X at http://thetitleiseverything.com/blog/2013/9/9/download-catch-up-tv
Anyone know if there’s a similar one for Windows 7?
Hi James,
Can’t seem to install ffmpeg on Mavericks.
sudo port install ffmpeg
Fails with…
Password:
--->  Computing dependencies for ffmpeg
--->  Dependencies to be installed: x264
--->  Configuring x264
Error: Failed to configure x264, consult /opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_multimedia_x264/x264/work/x264-956c8d8/config.log
Error: org.macports.configure for port x264 returned: configure failure: command execution failed
Error: Failed to install x264
Please see the log file for port x264 for details:
    /opt/local/var/macports/logs/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_multimedia_x264/x264/main.log
Error: The following dependencies were not installed: x264
To report a bug, follow the instructions in the guide:
    http://guide.macports.org/#project.tickets
Error: Processing of port ffmpeg failed
bills-imac-226:~ billcooper$ sudo port install ffmpeg
--->  Computing dependencies for ffmpeg
--->  Dependencies to be installed: x264
--->  Configuring x264
Error: Failed to configure x264, consult /opt/local/var/macports/build/_opt_loca
So now when I try to download, it looks good until the actual download. I get:
Downloading: Orphan Black S2 Ep1 – Nature Under Constraint And Vexed.f4m
Traceback (most recent call last):
  File "/Users/myname/Documents/TV Temp/webdl/grabber.py", line 56, in <module>
    main()
  File "/Users/billcooper/Documents/TV Temp/webdl/grabber.py", line 49, in main
    if not n.download():
  File "/Users/mynameDocuments/TV Temp/webdl/sbs.py", line 55, in download
    return download_urllib(filename, video_url, referrer=SWF_URL)
  File "/Users/mynameDocuments/TV Temp/webdl/common.py", line 228, in download_urllib
    src = _urlopen(url, referrer)
  File "/Users/myname/Documents/TV Temp/webdl/common.py", line 83, in _urlopen
    return urlopener.open(req)
  File "/opt/local/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/urllib2.py", line 410, in open
    response = meth(req, response)
  File "/opt/local/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/urllib2.py", line 523, in http_response
    'http', request, response, code, msg, hdrs)
  File "/opt/local/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/urllib2.py", line 448, in error
    return self._call_chain(*args)
  File "/opt/local/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/urllib2.py", line 382, in _call_chain
    result = func(*args)
  File "/opt/local/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/urllib2.py", line 531, in http_error_default
    raise HTTPError(req.get_full_url(), code, msg, hdrs, fp)
urllib2.HTTPError: HTTP Error 403: Forbidden