Looking for way to read table content in HTML file from Python3
I am looking for a way to read the NIFTY 50 Low and High values from the web page below in Python 3.
https://www.nseindia.com/live_market/dynaContent/live_watch/live_index_watch.htm
I tried using bs4 and Selenium WebDriver to read these values. Could you please let me know how I can read them?
Regards,
Ram
html python-3.x
Try BeautifulSoup.
– Dan Farrell
Dec 28 '18 at 0:49
@DanFarrell bs4 is BeautifulSoup, which the OP mentioned they have already tried.
– davedwards
Dec 28 '18 at 1:25
@Ram, how did you try using bs4 and Selenium? Can you show us your code? Then we can provide suggestions or solutions.
– davedwards
Dec 28 '18 at 2:53
asked Dec 28 '18 at 0:47
Ram
1 Answer
Without seeing your Selenium and bs4 code, we can't tell why it didn't work for you, but this code seems to work:
import bs4
from selenium import webdriver

url = 'https://www.nseindia.com/live_market/dynaContent/live_watch/live_index_watch.htm'

options = webdriver.ChromeOptions()
options.add_argument('--headless')  # run Chrome without a GUI
driver = webdriver.Chrome(r'path_to_chromedriver.exe', options=options)  # path to your chromedriver executable
driver.get(url)

soup = bs4.BeautifulSoup(driver.page_source, 'html.parser')
driver.quit()  # close the browser once the page source is captured

table = soup.find('table', id='liveIndexWatch')  # the live index table
nifty_50_row = table.find_all('tr')[2]           # first row of prices
high_low = nifty_50_row.find_all('td')[4:6]      # the 'High'/'Low' columns

# format output
print('NIFTY 50 High: {h} Low: {l}'.format(h=high_low[0].text, l=high_low[1].text))
prints:
NIFTY 50 High: 10,834.20 Low: 10,764.45  # values at the time of writing
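As a side note, once you have the rendered HTML (e.g. from driver.page_source), the cell extraction itself needs no third-party parser: the standard library's html.parser can walk the table. A minimal sketch, using a small hypothetical sample that mimics the column layout the answer assumes (the real markup on the NSE page may differ):

```python
from html.parser import HTMLParser

# Hypothetical sample mimicking the assumed #liveIndexWatch layout.
SAMPLE = """
<table id="liveIndexWatch">
  <tr><th>Index</th><th>Current</th><th>%Chng</th><th>Open</th><th>High</th><th>Low</th></tr>
  <tr><td>NIFTY 50</td><td>10,779.80</td><td>-0.12</td><td>10,820.95</td><td>10,834.20</td><td>10,764.45</td></tr>
</table>
"""

class TableParser(HTMLParser):
    """Collect the text of each <td> cell, grouped by row."""
    def __init__(self):
        super().__init__()
        self.rows, self._row, self._in_td = [], [], False

    def handle_starttag(self, tag, attrs):
        if tag == 'tr':
            self._row = []          # start a fresh row
        elif tag == 'td':
            self._in_td = True

    def handle_endtag(self, tag):
        if tag == 'tr' and self._row:
            self.rows.append(self._row)  # header rows (th only) stay empty and are skipped
        elif tag == 'td':
            self._in_td = False

    def handle_data(self, data):
        if self._in_td and data.strip():
            self._row.append(data.strip())

parser = TableParser()
parser.feed(SAMPLE)
nifty = parser.rows[0]            # first data row
high, low = nifty[4], nifty[5]    # 'High' and 'Low' columns
print(high, low)                  # 10,834.20 10,764.45
```

This only replaces the parsing step; you would still need Selenium (or another headless browser) to obtain the rendered page source in the first place.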
edited Dec 28 '18 at 2:57
answered Dec 28 '18 at 2:39
davedwards