forked from midou/invidious
Compare commits
16 Commits

- 022427e20e
- 88430a6fc0
- c72b9bea64
- 80bc29f3cd
- f7125c1204
- 6f9056fd84
- 3733fe8272
- 98bb20abcd
- a4d44d3286
- dc358fc7e5
- e14f2f2750
- 650b44ade2
- 3830604e42
- f83e9e6eb9
- 236358d3ad
- 43d6b65b4f

CHANGELOG.md (24)

@@ -1,10 +1,26 @@

# 0.8.0 (2018-10-02)

## Week 8: Mixes

Hello again!

Mixes have been added with [`20130db`](https://github.com/omarroth/invidious/20130db), which makes it easy to create a playlist of related content. See [#188](https://github.com/omarroth/invidious/issues/188) for more info on how they work. Currently, they return the first 50 videos rather than a continuous feed to avoid tracking by Google/YouTube, which I think is a good trade-off between usability and privacy, and I hope other folks agree. You can create mixes by adding `RD` to the beginning of a video ID; an example based on Big Buck Bunny is provided [here](https://www.invidio.us/mix?list=RDYE7VzlLtp-4). I've been quite happy with the results returned for the mixes I've tried, and they are not limited to music, which I think is a big plus. To emulate the continuous feed that many are used to, using the last video of each mix as a new 'seed' has worked well for me. In the coming week I'd like to add playback support in the player so these can be listened to easily.
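
As a quick illustration (a sketch, not something from the changelog or the codebase), a mix URL for the public invidio.us instance can be formed by prefixing a seed video ID with `RD`:

```crystal
# Sketch: form a mix URL by prefixing a seed video ID with "RD".
# The video ID below is the Big Buck Bunny example referenced above.
video_id = "YE7VzlLtp-4"
mix_url  = "https://www.invidio.us/mix?list=RD#{video_id}"
puts mix_url # => https://www.invidio.us/mix?list=RDYE7VzlLtp-4
```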

A very big thanks to [**@flourgaz**](https://github.com/flourgaz) for Docker support with [#186](https://github.com/omarroth/invidious/pull/186). This is an enormous improvement in portability for the project, and opens the door to Heroku support (see [#162](https://github.com/omarroth/invidious/issues/162)) and seamless support on Windows. For most users, it should be as easy as running `docker-compose up`.

I've spent quite a bit of time this past week improving support for geo-bypass (see [#92](https://github.com/omarroth/invidious/issues/92)), and am happy to note that Invidious has been able to proxy ~50% of the geo-restricted videos I've tried. In addition, you can now watch geo-restricted videos if you have `dash` enabled as your `preferred quality`; for more details, see [#34](https://github.com/omarroth/invidious/issues/34) and [#185](https://github.com/omarroth/invidious/issues/185), or last week's update. For folks interested in replicating these results, take a look [here](https://gist.github.com/omarroth/3ce0f276c43e0c4b13e7d9cd35524688) for the script used and [here](https://gist.github.com/omarroth/beffc4a76a7b82a422e1b36a571878ef) for a list of videos restricted in the US.

1080p has seen a fairly smooth roll-out, although there have been a couple of issues reported, mainly [#193](https://github.com/omarroth/invidious/issues/193), which is likely an issue in the player. I've also encountered a couple of other issues myself that I would like to investigate. Although none are major, I'd like to keep 1080p opt-in for registered users another week to better address these issues.

Have an excellent week everyone.

# 0.7.0 (2018-09-25)

## Week 7: 1080p and Search Types

Hello again everyone! I've got quite a few announcements this week:

Experimental 1080p support has been added with [`b3ca392`](https://github.com/omarroth/invidious/b3ca392), and can be enabled by going to preferences and changing `preferred video quality` to `dash`. You can find more details [here](https://github.com/omarroth/invidious/issues/34#issuecomment-424171888). Currently quality and speed controls have not yet been integrated into the player, but I'd still appreciate feedback, mainly on any issues with buffering or DASH playback. I hope to integrate 1080p support into the player and push support site-wide in the coming weeks.

You can now filter content types in search with the `type:TYPE` filter. Supported content types are `playlist`, `channel`, and `video`. More info is available [here](https://github.com/omarroth/invidious/issues/126#issuecomment-423823148). I think this is quite an improvement in usability and I hope others find the same.
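
For illustration only (a sketch, not code from this commit), a filtered search URL could be assembled like this; `URI.encode_www_form` from Crystal's standard library is assumed for query encoding:

```crystal
require "uri"

# Sketch: restrict a search to channels using the type: filter described above.
query = "type:channel blender"
url   = "https://www.invidio.us/search?q=#{URI.encode_www_form(query)}"
puts url # => https://www.invidio.us/search?q=type%3Achannel+blender
```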

@@ -13,9 +29,9 @@ A [CHANGELOG](https://github.com/omarroth/invidious/blob/master/CHANGELOG.md) ha

Recently, users have been reporting 504s when attempting to access their subscriptions, which is tracked in [#173](https://github.com/omarroth/invidious/issues/173). This is most likely caused by an uptick in usage, which I'm absolutely grateful for, but it has unfortunately increased hosting costs for the site, so I will be bumping my goal on Patreon from $60 to $80. I would appreciate any feedback on how subscriptions could be improved.

Other minor improvements include:

- Additional regions added to bypass geo-block with [`9a78523`](https://github.com/omarroth/invidious/9a78523)
- Fix for playlists containing less than 100 videos (previously shown as empty) with [`35ac887`](https://github.com/omarroth/invidious/35ac887)
- Fix for `published` date for Reddit comments (previously showing negative seconds) with [`6e09202`](https://github.com/omarroth/invidious/6e09202)

Thank you everyone for your support!

@@ -17,6 +17,11 @@ div {
  animation: spin 2s linear infinite;
}

.playlist-restricted {
  height: 20em;
  padding-right: 10px;
}

/*
 * Navbar
 */

assets/js/watch.js (new file, 48)

@@ -0,0 +1,48 @@
function toggle_parent(target) {
  body = target.parentNode.parentNode.children[1];
  if (body.style.display === null || body.style.display === "") {
    target.innerHTML = "[ + ]";
    body.style.display = "none";
  } else {
    target.innerHTML = "[ - ]";
    body.style.display = "";
  }
}

function toggle_comments(target) {
  body = target.parentNode.parentNode.parentNode.children[1];
  if (body.style.display === null || body.style.display === "") {
    target.innerHTML = "[ + ]";
    body.style.display = "none";
  } else {
    target.innerHTML = "[ - ]";
    body.style.display = "";
  }
}

function swap_comments(source) {
  comments = document.getElementById("comments");
  var fallback = comments.innerHTML;
  comments.innerHTML =
    '<h3><center class="loading"><i class="icon ion-ios-refresh"></i></center></h3>';

  if (source == "youtube") {
    get_youtube_comments();
  } else if (source == "reddit") {
    get_reddit_comments();
  }
}

function commaSeparateNumber(val) {
  while (/(\d+)(\d{3})/.test(val.toString())) {
    val = val.toString().replace(/(\d+)(\d{3})/, "$1" + "," + "$2");
  }
  return val;
}

String.prototype.supplant = function(o) {
  return this.replace(/{([^{}]*)}/g, function(a, b) {
    var r = o[b];
    return typeof r === "string" || typeof r === "number" ? r : a;
  });
};

@@ -9,3 +9,4 @@ db:
  dbname: invidious
full_refresh: false
https_only: false
geo_bypass: true

@@ -1,5 +1,5 @@
name: invidious
version: 0.7.0
version: 0.8.0

authors:
  - Omar Roth <omarroth@hotmail.com>

src/invidious.cr (123)

@@ -106,12 +106,14 @@ spawn do
end

proxies = {} of String => Array({ip: String, port: Int32})
spawn do
if CONFIG.geo_bypass
spawn do
find_working_proxies(BYPASS_REGIONS) do |region, list|
if !list.empty?
proxies[region] = list
end
end
end
end

before_all do |env|

@@ -215,6 +217,8 @@ get "/watch" do |env|
next env.redirect "/"
end

plid = env.params.query["list"]?

user = env.get? "user"
if user
user = user.as(User)

@@ -235,6 +239,8 @@ get "/watch" do |env|

begin
video = get_video(id, PG_DB, proxies)
rescue ex : VideoRedirect
next env.redirect "/watch?v=#{ex.message}"
rescue ex
error_message = ex.message
STDOUT << id << " : " << ex.message << "\n"

@@ -335,6 +341,8 @@ get "/embed/:id" do |env|

begin
video = get_video(id, PG_DB, proxies)
rescue ex : VideoRedirect
next env.redirect "/embed/#{ex.message}"
rescue ex
error_message = ex.message
next templated "error"

@@ -400,6 +408,10 @@ get "/playlist" do |env|
page = env.params.query["page"]?.try &.to_i?
page ||= 1

if plid.starts_with? "RD"
next env.redirect "/mix?list=#{plid}"
end

begin
playlist = fetch_playlist(plid)
rescue ex

@@ -1113,12 +1125,14 @@ post "/data_control" do |env|
body = JSON.parse(body)
body["subscriptions"].as_a.each do |ucid|
ucid = ucid.as_s
if !user.subscriptions.includes? ucid
PG_DB.exec("UPDATE users SET subscriptions = array_append(subscriptions,$1) WHERE id = $2", ucid, user.id)

if !user.subscriptions.includes? ucid
begin
client = make_client(YT_URL)
get_channel(ucid, client, PG_DB, false, false)

PG_DB.exec("UPDATE users SET subscriptions = array_append(subscriptions,$1) WHERE email = $2", ucid, user.email)
user.subscriptions << ucid
rescue ex
next
end

@@ -1127,8 +1141,10 @@ post "/data_control" do |env|

body["watch_history"].as_a.each do |id|
id = id.as_s

if !user.watched.includes? id
PG_DB.exec("UPDATE users SET watched = array_append(watched,$1) WHERE email = $2", id, user.email)
user.watched << id
end
end

@@ -1139,11 +1155,12 @@ post "/data_control" do |env|
ucid = channel["xmlUrl"].match(/UC[a-zA-Z0-9_-]{22}/).not_nil![0]

if !user.subscriptions.includes? ucid
PG_DB.exec("UPDATE users SET subscriptions = array_append(subscriptions,$1) WHERE email = $2", ucid, user.email)

begin
client = make_client(YT_URL)
get_channel(ucid, client, PG_DB, false, false)

PG_DB.exec("UPDATE users SET subscriptions = array_append(subscriptions,$1) WHERE email = $2", ucid, user.email)
user.subscriptions << ucid
rescue ex
next
end

@@ -1154,11 +1171,12 @@ post "/data_control" do |env|
ucid = md["channel_id"]

if !user.subscriptions.includes? ucid
PG_DB.exec("UPDATE users SET subscriptions = array_append(subscriptions,$1) WHERE email = $2", ucid, user.email)

begin
client = make_client(YT_URL)
get_channel(ucid, client, PG_DB, false, false)

PG_DB.exec("UPDATE users SET subscriptions = array_append(subscriptions,$1) WHERE email = $2", ucid, user.email)
user.subscriptions << ucid
rescue ex
next
end

@@ -1170,11 +1188,12 @@ post "/data_control" do |env|
ucid = channel["url"].as_s.match(/UC[a-zA-Z0-9_-]{22}/).not_nil![0]

if !user.subscriptions.includes? ucid
PG_DB.exec("UPDATE users SET subscriptions = array_append(subscriptions,$1) WHERE email = $2", ucid, user.email)

begin
client = make_client(YT_URL)
get_channel(ucid, client, PG_DB, false, false)

PG_DB.exec("UPDATE users SET subscriptions = array_append(subscriptions,$1) WHERE email = $2", ucid, user.email)
user.subscriptions << ucid
rescue ex
next
end

@@ -1190,19 +1209,24 @@ post "/data_control" do |env|

db = entry.io.gets_to_end
db.scan(/youtube\.com\/watch\?v\=(?<id>[a-zA-Z0-9_-]{11})/) do |md|
if !user.watched.includes? md["id"]
PG_DB.exec("UPDATE users SET watched = array_append(watched,$1) WHERE email = $2", md["id"], user.email)
id = md["id"]

if !user.watched.includes? id
PG_DB.exec("UPDATE users SET watched = array_append(watched,$1) WHERE email = $2", id, user.email)
user.watched << id
end
end

db.scan(/youtube\.com\/channel\/(?<ucid>[a-zA-Z0-9_-]{22})/) do |md|
ucid = md["ucid"]
if !user.subscriptions.includes? ucid
PG_DB.exec("UPDATE users SET subscriptions = array_append(subscriptions,$1) WHERE email = $2", ucid, user.email)

if !user.subscriptions.includes? ucid
begin
client = make_client(YT_URL)
get_channel(ucid, client, PG_DB, false, false)

PG_DB.exec("UPDATE users SET subscriptions = array_append(subscriptions,$1) WHERE email = $2", ucid, user.email)
user.subscriptions << ucid
rescue ex
next
end

@@ -1759,6 +1783,8 @@ get "/api/v1/captions/:id" do |env|
client = make_client(YT_URL)
begin
video = get_video(id, PG_DB, proxies)
rescue ex : VideoRedirect
next env.redirect "/api/v1/captions/#{ex.message}"
rescue ex
halt env, status_code: 403
end

@@ -1874,6 +1900,8 @@ get "/api/v1/comments/:id" do |env|

proxies.each do |region, list|
spawn do
proxy_html = %(<meta itemprop="regionsAllowed" content="">)

list.each do |proxy|
begin
proxy_client = HTTPClient.new(YT_URL)

@@ -1884,10 +1912,10 @@ get "/api/v1/comments/:id" do |env|
proxy = HTTPProxy.new(proxy_host: proxy[:ip], proxy_port: proxy[:port])
proxy_client.set_proxy(proxy)

proxy_html = proxy_client.get("/watch?v=#{id}&bpctr=#{Time.new.epoch + 2000}&gl=US&hl=en&disable_polymer=1")
response = proxy_client.get("/watch?v=#{id}&bpctr=#{Time.new.epoch + 2000}&gl=US&hl=en&disable_polymer=1")
proxy_headers = HTTP::Headers.new
proxy_headers["cookie"] = proxy_html.cookies.add_request_headers(headers)["cookie"]
proxy_html = proxy_html.body
proxy_headers["cookie"] = response.cookies.add_request_headers(headers)["cookie"]
proxy_html = response.body

if proxy_html.match(/<meta itemprop="regionsAllowed" content="">/)
bypass_channel.send(nil)

@@ -1899,6 +1927,11 @@ get "/api/v1/comments/:id" do |env|
rescue ex
end
end

# If none of the proxies we tried returned a valid response
if proxy_html.match(/<meta itemprop="regionsAllowed" content="">/)
bypass_channel.send(nil)
end
end
end

@@ -2203,6 +2236,8 @@ get "/api/v1/videos/:id" do |env|

begin
video = get_video(id, PG_DB, proxies)
rescue ex : VideoRedirect
next env.redirect "/api/v1/videos/#{ex.message}"
rescue ex
error_message = {"error" => ex.message}.to_json
halt env, status_code: 500, response: error_message

@@ -2906,6 +2941,15 @@ get "/api/v1/playlists/:plid" do |env|
page = env.params.query["page"]?.try &.to_i?
page ||= 1

format = env.params.query["format"]?
format ||= "json"

continuation = env.params.query["continuation"]?

if plid.starts_with? "RD"
next env.redirect "/api/v1/mixes/#{plid}"
end

begin
playlist = fetch_playlist(plid)
rescue ex

@@ -2914,7 +2958,7 @@ get "/api/v1/playlists/:plid" do |env|
end

begin
videos = fetch_playlist_videos(plid, page, playlist.video_count)
videos = fetch_playlist_videos(plid, page, playlist.video_count, continuation)
rescue ex
videos = [] of PlaylistVideo
end

@@ -2973,6 +3017,17 @@ get "/api/v1/playlists/:plid" do |env|
end
end

if format == "html"
response = JSON.parse(response)
playlist_html = template_playlist(response)
next_video = response["videos"].as_a[1]?.try &.["videoId"]

response = {
"playlistHtml" => playlist_html,
"nextVideo" => next_video,
}.to_json
end

response
end

@@ -2984,6 +3039,9 @@ get "/api/v1/mixes/:rdid" do |env|
continuation = env.params.query["continuation"]?
continuation ||= rdid.lchop("RD")

format = env.params.query["format"]?
format ||= "json"

begin
mix = fetch_mix(rdid, continuation)
rescue ex

@@ -3022,6 +3080,17 @@ get "/api/v1/mixes/:rdid" do |env|
end
end

if format == "html"
response = JSON.parse(response)
playlist_html = template_mix(response)
next_video = response["videos"].as_a[1]?.try &.["videoId"]

response = {
"playlistHtml" => playlist_html,
"nextVideo" => next_video,
}.to_json
end

response
end

@@ -3045,6 +3114,8 @@ get "/api/manifest/dash/id/:id" do |env|
client = make_client(YT_URL)
begin
video = get_video(id, PG_DB, proxies)
rescue ex : VideoRedirect
next env.redirect "/api/manifest/dash/id/#{ex.message}"
rescue ex
halt env, status_code: 403
end

@@ -3408,6 +3479,24 @@ get "/vi/:id/:name" do |env|
end

error 404 do |env|
if md = env.request.path.match(/^\/(?<id>[a-zA-Z0-9_-]{11})/)
id = md["id"]

params = [] of String
env.params.query.each do |k, v|
params << "#{k}=#{v}"
end
params = params.join("&")

url = "/watch?v=#{id}"
if !params.empty?
url += "&#{params}"
end

env.response.headers["Location"] = url
halt env, status_code: 302
end

error_message = "404 Page not found"
templated "error"
end

@@ -104,22 +104,24 @@ def template_youtube_comments(comments)

html += <<-END_HTML
<div class="pure-g">
<div class="pure-u-2-24">
<div class="pure-u-4-24 pure-u-md-2-24">
<img style="width:90%; padding-right:1em; padding-top:1em;" src="#{author_thumbnail}">
</div>
<div class="pure-u-22-24">
<div class="pure-u-20-24 pure-u-md-22-24">
<p>
<a href="javascript:void(0)" onclick="toggle(this)">[ - ]</a>
<i class="icon ion-ios-thumbs-up"></i> #{child["likeCount"]}
<b><a href="#{child["authorUrl"]}">#{child["author"]}</a></b>
- #{recode_date(Time.epoch(child["published"].as_i64))} ago
</p>
<a href="javascript:void(0)" onclick="toggle_parent(this)">[ - ]</a>
<b>
<a href="#{child["authorUrl"]}">#{child["author"]}</a>
</b>
<div>
<p style="white-space:pre-wrap">#{child["contentHtml"]}</p>
#{recode_date(Time.epoch(child["published"].as_i64))} ago
|
<i class="icon ion-ios-thumbs-up"></i> #{child["likeCount"]}
</p>
#{replies_html}
</div>
</div>
</div>
END_HTML
end

@@ -156,10 +158,10 @@ def template_reddit_comments(root)

content = <<-END_HTML
<p>
<a href="javascript:void(0)" onclick="toggle(this)">[ - ]</a>
<i class="icon ion-ios-thumbs-up"></i> #{score}
<a href="javascript:void(0)" onclick="toggle_parent(this)">[ - ]</a>
<b><a href="https://www.reddit.com/user/#{author}">#{author}</a></b>
- #{recode_date(child.created_utc)} ago
#{score} points
#{recode_date(child.created_utc)} ago
</p>
<div>
#{body_html}

@@ -14,6 +14,7 @@ class Config
https_only: Bool?,
hmac_key: String?,
full_refresh: Bool,
geo_bypass: Bool,
})
end

@@ -93,6 +93,25 @@ def get_proxies(country_code = "US")
return get_nova_proxies(country_code)
end

def filter_proxies(proxies)
proxies.select! do |proxy|
begin
client = HTTPClient.new(YT_URL)
client.read_timeout = 10.seconds
client.connect_timeout = 10.seconds

proxy = HTTPProxy.new(proxy_host: proxy[:ip], proxy_port: proxy[:port])
client.set_proxy(proxy)

client.head("/").status_code == 200
rescue ex
false
end
end

return proxies
end

def get_nova_proxies(country_code = "US")
country_code = country_code.downcase
client = HTTP::Client.new(URI.parse("https://www.proxynova.com"))

@@ -127,7 +146,7 @@ def get_nova_proxies(country_code = "US")
proxies << {ip: ip, port: port, score: score}
end

proxies = proxies.sort_by { |proxy| proxy[:score] }.reverse
# proxies = proxies.sort_by { |proxy| proxy[:score] }.reverse
return proxies
end

@@ -156,39 +156,14 @@ def update_decrypt_function
end

def find_working_proxies(regions)
proxy_channel = Channel({String, Array({ip: String, port: Int32})}).new

loop do
regions.each do |region|
spawn do
loop do
begin
proxies = get_proxies(region).first(20)
rescue ex
next proxy_channel.send({region, Array({ip: String, port: Int32}).new})
end

proxies.select! do |proxy|
begin
client = HTTPClient.new(YT_URL)
client.read_timeout = 10.seconds
client.connect_timeout = 10.seconds

proxy = HTTPProxy.new(proxy_host: proxy[:ip], proxy_port: proxy[:port])
client.set_proxy(proxy)

client.get("/").status_code == 200
rescue ex
false
end
end
proxies = proxies.map { |proxy| {ip: proxy[:ip], port: proxy[:port]} }
# proxies = filter_proxies(proxies)

proxy_channel.send({region, proxies})
yield region, proxies
Fiber.yield
end
end
end

loop do
yield proxy_channel.receive
end
end

@@ -6,6 +6,7 @@ class MixVideo
ucid: String,
length_seconds: Int32,
index: Int32,
mixes: Array(String),
})
end

@@ -34,6 +35,10 @@ def fetch_mix(rdid, video_id, cookies = nil)
raise "Could not create mix."
end

if !yt_data["contents"]["twoColumnWatchNextResults"]["playlist"]?
raise "Could not create mix."
end

playlist = yt_data["contents"]["twoColumnWatchNextResults"]["playlist"]["playlist"]
mix_title = playlist["title"].as_s

@@ -59,7 +64,8 @@ def fetch_mix(rdid, video_id, cookies = nil)
author,
ucid,
length_seconds,
index
index,
[rdid]
)
end

@@ -72,3 +78,37 @@ def fetch_mix(rdid, video_id, cookies = nil)
videos = videos.first(50)
return Mix.new(mix_title, rdid, videos)
end

def template_mix(mix)
html = <<-END_HTML
<h3>
<a href="/mix?list=#{mix["mixId"]}">
#{mix["title"]}
</a>
</h3>
<div class="pure-menu pure-menu-scrollable playlist-restricted">
<ol class="pure-menu-list">
END_HTML

mix["videos"].as_a.each do |video|
html += <<-END_HTML
<li class="pure-menu-item">
<a href="/watch?v=#{video["videoId"]}&list=#{mix["mixId"]}">
<img style="width:100%;" src="/vi/#{video["videoId"]}/mqdefault.jpg">
<p style="width:100%">#{video["title"]}</p>
<p>
<b style="width: 100%">#{video["author"]}</b>
</p>
</a>
</li>
END_HTML
end

html += <<-END_HTML
</ol>
</div>
<hr>
END_HTML

html
end

@@ -26,11 +26,23 @@ class Playlist
})
end

def fetch_playlist_videos(plid, page, video_count)
def fetch_playlist_videos(plid, page, video_count, continuation = nil)
client = make_client(YT_URL)

if video_count > 100
if continuation
html = client.get("/watch?v=#{continuation}&list=#{plid}&bpctr=#{Time.new.epoch + 2000}&gl=US&hl=en&disable_polymer=1")
html = XML.parse_html(html.body)

index = html.xpath_node(%q(//span[@id="playlist-current-index"])).try &.content.to_i?
if index
index -= 1
end
index ||= 0
else
index = (page - 1) * 100
end

if video_count > 100
url = produce_playlist_url(plid, index)

response = client.get(url)

@@ -199,3 +211,37 @@ def fetch_playlist(plid)

return playlist
end

def template_playlist(playlist)
html = <<-END_HTML
<h3>
<a href="/playlist?list=#{playlist["playlistId"]}">
#{playlist["title"]}
</a>
</h3>
<div class="pure-menu pure-menu-scrollable playlist-restricted">
<ol class="pure-menu-list">
END_HTML

playlist["videos"].as_a.each do |video|
html += <<-END_HTML
<li class="pure-menu-item">
<a href="/watch?v=#{video["videoId"]}&list=#{playlist["playlistId"]}">
<img style="width:100%;" src="/vi/#{video["videoId"]}/mqdefault.jpg">
<p style="width:100%">#{video["title"]}</p>
<p>
<b style="width: 100%">#{video["author"]}</b>
</p>
</a>
</li>
END_HTML
end

html += <<-END_HTML
</ol>
</div>
<hr>
END_HTML

html
end

@@ -477,6 +477,9 @@ class CaptionName
)
end

class VideoRedirect < Exception
end

def get_video(id, db, proxies = {} of String => Array({ip: String, port: Int32}), refresh = true)
if db.query_one?("SELECT EXISTS (SELECT true FROM videos WHERE id = $1)", id, as: Bool)
video = db.query_one("SELECT * FROM videos WHERE id = $1", id, as: Video)

@@ -511,14 +514,18 @@ def get_video(id, db, proxies = {} of String => Array({ip: String, port: Int32})
end

def fetch_video(id, proxies)
html_channel = Channel(XML::Node).new
html_channel = Channel(XML::Node | String).new
info_channel = Channel(HTTP::Params).new

spawn do
client = make_client(YT_URL)
html = client.get("/watch?v=#{id}&bpctr=#{Time.new.epoch + 2000}&gl=US&hl=en&disable_polymer=1")
html = XML.parse_html(html.body)

if md = html.headers["location"]?.try &.match(/v=(?<id>[a-zA-Z0-9_-]{11})/)
next html_channel.send(md["id"])
end

html = XML.parse_html(html.body)
html_channel.send(html)
end

@@ -536,6 +543,11 @@ def fetch_video(id, proxies)
end

html = html_channel.receive
if html.as?(String)
raise VideoRedirect.new("#{html.as(String)}")
end
html = html.as(XML::Node)

info = info_channel.receive

if info["reason"]? && info["reason"].includes? "your country"

@@ -543,6 +555,10 @@ def fetch_video(id, proxies)

proxies.each do |region, list|
spawn do
info = HTTP::Params.new({
"reason" => [info["reason"]],
})

list.each do |proxy|
begin
client = HTTPClient.new(YT_URL)

@@ -563,6 +579,11 @@ def fetch_video(id, proxies)
rescue ex
end
end

# If none of the proxies we tried returned a valid response
if info["reason"]?
bypass_channel.send(nil)
end
end
end

@@ -32,7 +32,7 @@
<p><%= number_with_separator(item.video_count) %> videos</p>
<p>PLAYLIST</p>
<% when MixVideo %>
<a style="width:100%;" href="/watch?v=<%= item.id %>">
<a style="width:100%;" href="/watch?v=<%= item.id %>&list=<%= item.mixes[0] %>">
<% if env.get?("user") && env.get("user").as(User).preferences.thin_mode %>
<% else %>
<img style="width:100%;" src="/vi/<%= item.id %>/mqdefault.jpg"/>

@@ -13,13 +13,13 @@
<div class="pure-g h-box">
<div class="pure-u-1 pure-u-md-1-5">
<% if page >= 2 %>
<a href="/search?q=<%= query %>&page=<%= page - 1 %>">Previous page</a>
<a href="/search?q=<%= HTML.escape(query.not_nil!) %>&page=<%= page - 1 %>">Previous page</a>
<% end %>
</div>
<div class="pure-u-1 pure-u-md-3-5"></div>
<div style="text-align:right;" class="pure-u-1 pure-u-md-1-5">
<% if count >= 20 %>
<a href="/search?q=<%= query %>&page=<%= page + 1 %>">Next page</a>
<a href="/search?q=<%= HTML.escape(query.not_nil!) %>&page=<%= page + 1 %>">Next page</a>
<% end %>
</div>
</div>

@@ -22,6 +22,7 @@
<meta name="twitter:player" content="<%= host_url %>/embed/<%= video.id %>">
<meta name="twitter:player:width" content="1280">
<meta name="twitter:player:height" content="720">
<script src="/js/watch.js"></script>
<%= rendered "components/player_sources" %>
<title><%= HTML.escape(video.title) %> - Invidious</title>
<% end %>

@@ -122,6 +123,13 @@
</div>
</div>
<div class="pure-u-1 pure-u-md-1-5">
<% if plid %>
<div id="playlist" class="h-box">
<h3><center class="loading"><i class="icon ion-ios-refresh"></i></center></h3>
<hr>
</div>
<% end %>

<% if !preferences || preferences && preferences.related_videos %>
<div class="h-box">
<% rvs.each do |rv| %>

@@ -144,61 +152,61 @@
</div>

<script>
function toggle(target) {
body = target.parentNode.parentNode.children[1];
if (body.style.display === null || body.style.display === "") {
target.innerHTML = "[ + ]";
body.style.display = "none";
<% if plid %>
function get_playlist() {
var plid = "<%= plid %>"

if (plid.startsWith("RD")) {
var plid_url = "/api/v1/mixes/<%= plid %>?continuation=<%= video.id %>&format=html";
} else {
target.innerHTML = "[ - ]";
body.style.display = "";
var plid_url = "/api/v1/playlists/<%= plid %>?continuation=<%= video.id %>&format=html";
}
}

function toggle_comments(target) {
body = target.parentNode.parentNode.parentNode.children[1];
if (body.style.display === null || body.style.display === "") {
target.innerHTML = "[ + ]";
body.style.display = "none";
} else {
target.innerHTML = "[ - ]";
body.style.display = "";
}
}

function get_youtube_replies(target) {
var continuation = target.getAttribute("data-continuation");

var body = target.parentNode.parentNode;
var fallback = body.innerHTML;
body.innerHTML =
'<h3><center class="loading"><i class="icon ion-ios-refresh"></i></center></h3>';

var url =
"/api/v1/comments/<%= video.id %>?format=html&continuation=" + continuation;
var xhr = new XMLHttpRequest();
xhr.responseType = "json";
xhr.timeout = 20000;
xhr.open("GET", url, true);
xhr.open("GET", plid_url, true);
xhr.send();

xhr.onreadystatechange = function() {
if (xhr.readyState == 4) {
if (xhr.status == 200) {
body.innerHTML = xhr.response.contentHtml;
playlist = document.getElementById("playlist");
playlist.innerHTML = xhr.response.playlistHtml;

if (xhr.response.nextVideo) {
player.on('ended', function() {
window.location.replace("/watch?v="
+ xhr.response.nextVideo
+ "&list=<%= plid %>"
<% if params[:listen] %>
+ "&listen=1"
<% end %>
<% if params[:autoplay] %>
+ "&autoplay=1"
<% end %>
);
});
}
} else {
body.innerHTML = fallback;
playlist.innerHTML = "";
}
}
};

xhr.ontimeout = function() {
console.log("Pulling comments timed out.");
console.log("Pulling playlist timed out.");

body.innerHTML = fallback;
comments = document.getElementById("playlist");
comments.innerHTML =
'<h3><center class="loading"><i class="icon ion-ios-refresh"></i></center></h3><hr>';
get_playlist();
};
}

get_playlist();
<% end %>

function get_reddit_comments() {
var url = "/api/v1/comments/<%= video.id %>?source=reddit&format=html";
var xhr = new XMLHttpRequest();

@@ -208,7 +216,7 @@ function get_reddit_comments() {
xhr.send();

xhr.onreadystatechange = function() {
if (xhr.readyState == 4)
if (xhr.readyState == 4) {
if (xhr.status == 200) {
comments = document.getElementById("comments");
comments.innerHTML = ' \

@@ -217,6 +225,13 @@ function get_reddit_comments() {
<a href="javascript:void(0)" onclick="toggle_comments(this)">[ - ]</a> \
{title} \
</h3> \
<p> \
<b> \
<a href="javascript:void(0)" onclick="swap_comments(\'youtube\')"> \
View YouTube comments \
</a> \
</b> \
</p> \
<b> \
<a rel="noopener" target="_blank" href="https://reddit.com{permalink}">View more comments on Reddit</a> \
</b> \

@@ -235,6 +250,7 @@ function get_reddit_comments() {
comments.innerHTML = "";
<% end %>
}
}
};

xhr.ontimeout = function() {

@@ -253,7 +269,7 @@ function get_youtube_comments() {
xhr.send();

xhr.onreadystatechange = function() {
if (xhr.readyState == 4)
if (xhr.readyState == 4) {
if (xhr.status == 200) {
comments = document.getElementById("comments");
if (xhr.response.commentCount > 0) {

@@ -263,6 +279,11 @@ function get_youtube_comments() {
<a href="javascript:void(0)" onclick="toggle_comments(this)">[ - ]</a> \
View {commentCount} comments \
</h3> \
<b> \
<a href="javascript:void(0)" onclick="swap_comments(\'reddit\')"> \
View Reddit comments \
</a> \
</b> \
</div> \
<div>{contentHtml}</div> \
<hr>'.supplant({

@@ -280,6 +301,7 @@ function get_youtube_comments() {
comments.innerHTML = "";
<% end %>
}
}
};

xhr.ontimeout = function() {

@@ -292,19 +314,38 @@ function get_youtube_comments() {
};
}

function commaSeparateNumber(val){
while (/(\d+)(\d{3})/.test(val.toString())){
val = val.toString().replace(/(\d+)(\d{3})/, '$1'+','+'$2');
}
return val;
}
function get_youtube_replies(target) {
var continuation = target.getAttribute('data-continuation');

String.prototype.supplant = function(o) {
return this.replace(/{([^{}]*)}/g, function(a, b) {
var r = o[b];
return typeof r === "string" || typeof r === "number" ? r : a;
});
};
var body = target.parentNode.parentNode;
var fallback = body.innerHTML;
body.innerHTML =
'<h3><center class="loading"><i class="icon ion-ios-refresh"></i></center></h3>';

var url = '/api/v1/comments/<%= video.id %>?format=html&continuation=' +
continuation;
var xhr = new XMLHttpRequest();
xhr.responseType = 'json';
xhr.timeout = 20000;
xhr.open('GET', url, true);
xhr.send();

xhr.onreadystatechange = function() {
if (xhr.readyState == 4) {
if (xhr.status == 200) {
body.innerHTML = xhr.response.contentHtml;
} else {
body.innerHTML = fallback;
}
}
};

xhr.ontimeout = function() {
console.log('Pulling comments timed out.');

body.innerHTML = fallback;
};
}

<% if preferences %>
<% if preferences.comments[0] == "youtube" %>