Merge branch 'main' into hotfix/proxy

pull/1371/head
ValueRaider 2023-07-17 18:29:04 +01:00 committed by GitHub
commit 056b84d8fe
39 changed files with 2712 additions and 1974 deletions


@@ -9,7 +9,7 @@ assignees: ''
# IMPORTANT
-If you want help, you got to read this first, follow the instructions.
+# Read and follow these instructions carefully. Help us help you.
### Are you up-to-date?
@@ -25,19 +25,20 @@ and comparing against [PIP](https://pypi.org/project/yfinance/#history).
### Does Yahoo actually have the data?
-Are you spelling ticker *exactly* same as Yahoo?
+Are you spelling symbol *exactly* same as Yahoo?
-Then visit `finance.yahoo.com` and confirm they have the data you want. Maybe your ticker was delisted, or your expectations of `yfinance` are wrong.
+Then visit `finance.yahoo.com` and confirm they have the data you want. Maybe your symbol was delisted, or your expectations of `yfinance` are wrong.
### Are you spamming Yahoo?
-Yahoo Finance free service has rate-limiting depending on request type - roughly 60/minute for prices, 10/minute for info. Once limit hit, Yahoo can delay, block, or return bad data. Not a `yfinance` bug.
+Yahoo Finance free service has rate-limiting depending on request type - roughly 60/minute for prices, 10/minute for info. Once limit hit, Yahoo can delay, block, or return bad data -> not a `yfinance` bug.
### Still think it's a bug?
-Delete this default message (all of it) and submit your bug report here, providing the following as best you can:
+**Delete these instructions** and replace with your bug report, providing the following as best you can:
-- Simple code that reproduces your problem, that we can copy-paste-run
-- Exception message with full traceback, or proof `yfinance` returning bad data
-- `yfinance` version and Python version
-- Operating system type
+- Simple code that reproduces your problem, that we can copy-paste-run.
+- Run code with [debug logging enabled](https://github.com/ranaroussi/yfinance#logging) and post the full output.
+- If you think `yfinance` returning bad data, give us proof.
+- `yfinance` version and Python version.
+- Operating system type.


@@ -1,14 +0,0 @@
----
-name: Feature request
-about: Request a new feature
-title: ''
-labels: ''
-assignees: ''
----
-**Describe the problem**
-**Describe the solution**
-**Additional context**


@@ -13,9 +13,9 @@ jobs:
    runs-on: ubuntu-latest
    steps:
-      - uses: actions/checkout@v2
+      - uses: actions/checkout@v3
      - name: Set up Python
-        uses: actions/setup-python@v2
+        uses: actions/setup-python@v4
        with:
          python-version: '3.x'
      - name: Install dependencies


@@ -1,6 +1,68 @@
Change Log
===========
0.2.24
------
Fix info[] missing values #1603
0.2.23
------
Fix 'Unauthorized' error #1595
0.2.22
------
Fix unhandled 'sqlite3.DatabaseError' #1574
0.2.21
------
Fix financials tables #1568
Price repair update: fix Yahoo messing up dividend and split adjustments #1543
Fix logging behaviour #1562
Fix merge future div/split into prices #1567
0.2.20
------
Switch to `logging` module #1493 #1522 #1541
Price history:
- optimise #1514
- fixes #1523
- fix TZ-cache corruption #1528
0.2.18
------
Fix 'fast_info' error '_np not found' #1496
Fix bug in timezone cache #1498
0.2.17
------
Fix prices error with Pandas 2.0 #1488
0.2.16
------
Fix 'fast_info deprecated' msg appearing at Ticker() init
0.2.15
------
Restore missing Ticker.info keys #1480
0.2.14
------
Fix Ticker.info dict by fetching from API #1461
0.2.13
------
Price bug fixes:
- fetch big-interval with Capital Gains #1455
- merging dividends & splits with prices #1452
0.2.12
------
Disable annoying 'backup decrypt' msg
0.2.11
------
Fix history_metadata accesses for unusual symbols #1411
0.2.10
------
General


@@ -42,11 +42,6 @@ Yahoo! finance API is intended for personal use only.**
---
-## News [2023-01-27]
-Since December 2022 Yahoo has been encrypting the web data that `yfinance` scrapes for non-market data. Fortunately the decryption keys are available, although Yahoo moved/changed them several times hence `yfinance` breaking several times. `yfinance` is now better prepared for any future changes by Yahoo.
-Why is Yahoo doing this? We don't know. Is it to stop scrapers? Maybe, so we've implemented changes to reduce load on Yahoo. In December we rolled out version 0.2 with optimised scraping. Then in 0.2.6 introduced `Ticker.fast_info`, providing much faster access to some `info` elements wherever possible e.g. price stats and forcing users to switch (sorry but we think necessary). `info` will continue to exist for as long as there are elements without a fast alternative.
## Quick Start
### The Ticker module
@@ -58,10 +53,8 @@ import yfinance as yf
msft = yf.Ticker("MSFT")
-# get all stock info (slow)
+# get all stock info
msft.info
-# fast access to subset of stock info (opportunistic)
-msft.fast_info
# get historical market data
hist = msft.history(period="1mo")
@@ -76,9 +69,6 @@ msft.splits
msft.capital_gains # only for mutual funds & etfs
# show share count
-# - yearly summary:
-msft.shares
-# - accurate time-series count:
msft.get_shares_full(start="2022-01-01", end=None)
# show financials:
@@ -98,25 +88,6 @@ msft.major_holders
msft.institutional_holders
msft.mutualfund_holders
-# show earnings
-msft.earnings
-msft.quarterly_earnings
-# show sustainability
-msft.sustainability
-# show analysts recommendations
-msft.recommendations
-msft.recommendations_summary
-# show analysts other work
-msft.analyst_price_target
-msft.revenue_forecasts
-msft.earnings_forecasts
-msft.earnings_trend
-# show next event (earnings, etc)
-msft.calendar
# Show future and historic earnings dates, returns at most next 4 quarters and last 8 quarters by default.
# Note: If more are needed use msft.get_earnings_dates(limit=XX) with increased limit argument.
msft.earnings_dates
@@ -154,6 +125,8 @@ msft.option_chain(..., proxy="PROXY_SERVER")
...
```
### Multiple tickers
To initialize multiple `Ticker` objects, use
```python
@@ -167,24 +140,18 @@ tickers.tickers['AAPL'].history(period="1mo")
tickers.tickers['GOOG'].actions
```
### Fetching data for multiple tickers
To download price history into one table:
```python
import yfinance as yf
data = yf.download("SPY AAPL", start="2017-01-01", end="2017-04-30")
data = yf.download("SPY AAPL", period="1mo")
```
`yf.download()` and `Ticker.history()` have many options for configuring fetching and processing, e.g.:
#### `yf.download()` and `Ticker.history()` have many options for configuring fetching and processing. [Review the Wiki](https://github.com/ranaroussi/yfinance/wiki) for more options and detail.
```python
yf.download(tickers = "SPY AAPL", # list of tickers
period = "1y", # time period
interval = "1d", # trading interval
ignore_tz = True, # ignore timezone when aligning data from different exchanges?
prepost = False) # download pre/post market hours data?
```
### Logging
-Review the [Wiki](https://github.com/ranaroussi/yfinance/wiki) for more options and detail.
`yfinance` now uses the `logging` module to handle messages; default behaviour is to only print errors. If debugging, use `yf.enable_debug_mode()` to switch logging to debug with custom formatting.
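For instance, a minimal sketch of turning on debug output with the `yf.enable_debug_mode()` call mentioned above (the ticker and period are just illustrative):
```python
import yfinance as yf

yf.enable_debug_mode()  # switch yfinance's logger from errors-only to debug output

msft = yf.Ticker("MSFT")
msft.history(period="1d")  # each request yfinance makes is now logged
```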
### Smarter scraping
@@ -206,11 +173,12 @@ Combine a `requests_cache` with rate-limiting to avoid triggering Yahoo's rate-limiter
from requests import Session
from requests_cache import CacheMixin, SQLiteCache
from requests_ratelimiter import LimiterMixin, MemoryQueueBucket
+from pyrate_limiter import Duration, RequestRate, Limiter

class CachedLimiterSession(CacheMixin, LimiterMixin, Session):
    """ """
    pass

session = CachedLimiterSession(
-    per_second=0.9,
+    limiter=Limiter(RequestRate(2, Duration.SECOND*5)),  # max 2 requests per 5 seconds
    bucket_class=MemoryQueueBucket,
    backend=SQLiteCache("yfinance.cache"),
)
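The resulting `session` can then be handed to yfinance; a minimal sketch, assuming the `session` argument that `yf.Ticker` accepts:
```python
import yfinance as yf

# Requests made through this Ticker are cached and rate-limited:
ticker = yf.Ticker("MSFT", session=session)
ticker.history(period="1mo")
```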
@@ -231,21 +199,7 @@ yfinance?](https://stackoverflow.com/questions/63107801)
- How to download single or multiple tickers into a single
dataframe with single level column names and a ticker column
-### Timezone cache store
-When fetching price data, all dates are localized to stock exchange timezone.
-But timezone retrieval is relatively slow, so yfinance attempts to cache them
-in your user's cache folder.
-You can direct the cache to use a different location with `set_tz_cache_location()`:
-```python
-import yfinance as yf
-yf.set_tz_cache_location("custom/cache/location")
-...
-```
---
-## `pandas_datareader` override
+### `pandas_datareader` override
If your code uses `pandas_datareader` and you want to download data
faster, you can "hijack" `pandas_datareader.data.get_data_yahoo()`
@@ -262,6 +216,18 @@ yf.pdr_override() # <== that's all it takes :-)
data = pdr.get_data_yahoo("SPY", start="2017-01-01", end="2017-04-30")
```
+### Timezone cache store
+When fetching price data, all dates are localized to stock exchange timezone.
+But timezone retrieval is relatively slow, so yfinance attempts to cache them
+in your user's cache folder.
+You can direct the cache to use a different location with `set_tz_cache_location()`:
+```python
+import yfinance as yf
+yf.set_tz_cache_location("custom/cache/location")
+...
+```
---
## Installation
@@ -272,6 +238,11 @@ Install `yfinance` using `pip`:
$ pip install yfinance --upgrade --no-cache-dir
```
+Test new features by installing betas, provide feedback in [corresponding Discussion](https://github.com/ranaroussi/yfinance/discussions):
+``` {.sourceCode .bash}
+$ pip install yfinance --upgrade --no-cache-dir --pre
+```
To install `yfinance` using `conda`, see
[this](https://anaconda.org/ranaroussi/yfinance).
@@ -289,11 +260,15 @@ To install `yfinance` using `conda`, see
- [html5lib](https://pypi.org/project/html5lib) \>= 1.1
- [cryptography](https://pypi.org/project/cryptography) \>= 3.3.2
-### Optional (if you want to use `pandas_datareader`)
+#### Optional (if you want to use `pandas_datareader`)
- [pandas\_datareader](https://github.com/pydata/pandas-datareader)
\>= 0.4.0
+## Developers: want to contribute?
+`yfinance` relies on the community to investigate bugs and contribute code. Developer guide: https://github.com/ranaroussi/yfinance/discussions/1084
---
### Legal Stuff


@@ -1,5 +1,5 @@
{% set name = "yfinance" %}
-{% set version = "0.2.10" %}
+{% set version = "0.2.24" %}
package:
  name: "{{ name|lower }}"


@@ -63,9 +63,8 @@ setup(
    'requests>=2.26', 'multitasking>=0.0.7',
    'lxml>=4.9.1', 'appdirs>=1.4.4', 'pytz>=2022.5',
    'frozendict>=2.3.4',
-    # 'pycryptodome>=3.6.6',
    'cryptography>=3.3.2',
    'beautifulsoup4>=4.11.1', 'html5lib>=1.1'],
    # Note: Pandas.read_html() needs html5lib & beautifulsoup4
entry_points={
'console_scripts': [
'sample=sample:main',


@@ -15,6 +15,9 @@ Sanity check for most common library uses all working
import yfinance as yf
import unittest
+import logging
+logging.basicConfig(level=logging.DEBUG)
symbols = ['MSFT', 'IWO', 'VFINX', '^GSPC', 'BTC-USD']
tickers = [yf.Ticker(symbol) for symbol in symbols]


@@ -7,3 +7,37 @@ _src_dp = _parent_dp
sys.path.insert(0, _src_dp)
import yfinance
# Optional: see the exact requests that are made during tests:
# import logging
# logging.basicConfig(level=logging.DEBUG)
# Setup a session to rate-limit and cache persistently:
import datetime as _dt
import os
import appdirs as _ad
from requests import Session
from requests_cache import CacheMixin, SQLiteCache
from requests_ratelimiter import LimiterMixin, MemoryQueueBucket
class CachedLimiterSession(CacheMixin, LimiterMixin, Session):
    pass
from pyrate_limiter import Duration, RequestRate, Limiter
history_rate = RequestRate(1, Duration.SECOND*2)
limiter = Limiter(history_rate)
cache_fp = os.path.join(_ad.user_cache_dir(), "py-yfinance", "unittests-cache")
if os.path.isfile(cache_fp + '.sqlite'):
    # Delete local cache if older than 1 day:
    mod_dt = _dt.datetime.fromtimestamp(os.path.getmtime(cache_fp + '.sqlite'))
    if mod_dt.date() < _dt.date.today():
        os.remove(cache_fp + '.sqlite')
session_gbl = CachedLimiterSession(
    limiter=limiter,
    bucket_class=MemoryQueueBucket,
    backend=SQLiteCache(cache_fp, expire_after=_dt.timedelta(hours=1)),
)
# Use this instead if only want rate-limiting:
# from requests_ratelimiter import LimiterSession
# session_gbl = LimiterSession(limiter=limiter)
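A hypothetical sketch of how a test module might consume `session_gbl` (the `tests.context` import path and the `TestPrices` name are assumptions for illustration):
```python
import unittest

from tests.context import yfinance as yf   # assumed import path
from tests.context import session_gbl

class TestPrices(unittest.TestCase):  # hypothetical test case
    def test_1d_history(self):
        # All Yahoo requests go through the shared cached, rate-limited session:
        ticker = yf.Ticker("MSFT", session=session_gbl)
        self.assertFalse(ticker.history(period="1d").empty)

if __name__ == '__main__':
    unittest.main()
```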


@@ -0,0 +1,23 @@
Date,Open,High,Low,Close,Adj Close,Volume,Dividends,Stock Splits
2023-04-14 00:00:00+09:00,4126,4130,4055,4129,4129,7459400,0,0
2023-04-13 00:00:00+09:00,4064,4099,4026,4081,4081,5160200,0,0
2023-04-12 00:00:00+09:00,3968,4084,3966,4064,4064,6372000,0,0
2023-04-11 00:00:00+09:00,3990,4019,3954,3960,3960,6476500,0,0
2023-04-10 00:00:00+09:00,3996,4009,3949,3964,3964,3485200,0,0
2023-04-07 00:00:00+09:00,3897,3975,3892,3953,3953,4554700,0,0
2023-04-06 00:00:00+09:00,4002,4004,3920,3942,3942,8615200,0,0
2023-04-05 00:00:00+09:00,4150,4150,4080,4088,4088,6063700,0,0
2023-04-04 00:00:00+09:00,4245,4245,4144,4155,4155,6780600,0,0
2023-04-03 00:00:00+09:00,4250,4259,4162,4182,4182,7076800,0,0
2023-03-31 00:00:00+09:00,4229,4299,4209,4275,4275,9608400,0,0
2023-03-30 00:00:00+09:00,4257,4268,4119,4161,4161,5535200,55,5
2023-03-29 00:00:00+09:00,4146,4211,4146,4206,4151,6514500,0,0
2023-03-28 00:00:00+09:00,4200,4207,4124,4142,4087.837109375,4505500,0,0
2023-03-27 00:00:00+09:00,4196,4204,4151,4192,4137.183203125,5959500,0,0
2023-03-24 00:00:00+09:00,4130,4187,4123,4177,4122.379296875,8961500,0,0
2023-03-23 00:00:00+09:00,4056,4106,4039,4086,4032.569140625,5480000,0,0
2023-03-22 00:00:00+09:00,4066,4128,4057,4122,4068.0984375,8741500,0,0
2023-03-20 00:00:00+09:00,4000,4027,3980,3980,3927.95546875,7006500,0,0
2023-03-17 00:00:00+09:00,4018,4055,4016,4031,3978.28828125,6961500,0,0
2023-03-16 00:00:00+09:00,3976,4045,3972,4035,3982.236328125,5019000,0,0
2023-03-15 00:00:00+09:00,4034,4050,4003,4041,3988.1578125,6122000,0,0

@@ -0,0 +1,23 @@
Date,Open,High,Low,Close,Adj Close,Volume,Dividends,Stock Splits
2023-04-14 00:00:00+09:00,4126,4130,4055,4129,4129,7459400,0,0
2023-04-13 00:00:00+09:00,4064,4099,4026,4081,4081,5160200,0,0
2023-04-12 00:00:00+09:00,3968,4084,3966,4064,4064,6372000,0,0
2023-04-11 00:00:00+09:00,3990,4019,3954,3960,3960,6476500,0,0
2023-04-10 00:00:00+09:00,3996,4009,3949,3964,3964,3485200,0,0
2023-04-07 00:00:00+09:00,3897,3975,3892,3953,3953,4554700,0,0
2023-04-06 00:00:00+09:00,4002,4004,3920,3942,3942,8615200,0,0
2023-04-05 00:00:00+09:00,4150,4150,4080,4088,4088,6063700,0,0
2023-04-04 00:00:00+09:00,4245,4245,4144,4155,4155,6780600,0,0
2023-04-03 00:00:00+09:00,4250,4259,4162,4182,4182,7076800,0,0
2023-03-31 00:00:00+09:00,4229,4299,4209,4275,4275,9608400,0,0
2023-03-30 00:00:00+09:00,4257,4268,4119,4161,4161,5535200,55,5
2023-03-29 00:00:00+09:00,4146,4211,4146,4206,4151,6514500,0,0
2023-03-28 00:00:00+09:00,21000,21035,20620,20710,20439.185546875,901100,0,0
2023-03-27 00:00:00+09:00,20980,21020,20755,20960,20685.916015625,1191900,0,0
2023-03-24 00:00:00+09:00,20650,20935,20615,20885,20611.896484375,1792300,0,0
2023-03-23 00:00:00+09:00,20280,20530,20195,20430,20162.845703125,1096000,0,0
2023-03-22 00:00:00+09:00,20330,20640,20285,20610,20340.4921875,1748300,0,0
2023-03-20 00:00:00+09:00,20000,20135,19900,19900,19639.77734375,1401300,0,0
2023-03-17 00:00:00+09:00,20090,20275,20080,20155,19891.44140625,1392300,0,0
2023-03-16 00:00:00+09:00,19880,20225,19860,20175,19911.181640625,1003800,0,0
2023-03-15 00:00:00+09:00,20170,20250,20015,20205,19940.7890625,1224400,0,0

@@ -0,0 +1,30 @@
Date,Open,High,Low,Close,Adj Close,Volume,Dividends,Stock Splits
2023-04-20 00:00:00+02:00,3,3,2,3,3,2076,0,0
2023-04-21 00:00:00+02:00,3,3,2,3,3,2136,0,0
2023-04-24 00:00:00+02:00,3,3,1,1,1,77147,0,0
2023-04-25 00:00:00+02:00,1,2,1,2,2,9625,0,0
2023-04-26 00:00:00+02:00,2,2,1,2,2,5028,0,0
2023-04-27 00:00:00+02:00,2,2,1,1,1,3235,0,0
2023-04-28 00:00:00+02:00,2,2,1,2,2,10944,0,0
2023-05-02 00:00:00+02:00,2,2,2,2,2,12220,0,0
2023-05-03 00:00:00+02:00,2,2,2,2,2,4683,0,0
2023-05-04 00:00:00+02:00,2,2,1,2,2,3368,0,0
2023-05-05 00:00:00+02:00,2,2,1,2,2,26069,0,0
2023-05-08 00:00:00+02:00,1,2,1,1,1,70540,0,0
2023-05-09 00:00:00+02:00,1,2,1,1,1,14228,0,0
2023-05-10 00:00:00+02:00,1.08000004291534,1.39999997615814,0.879999995231628,1,1,81012,0,0.0001
2023-05-11 00:00:00+02:00,1.03999996185303,1.03999996185303,0.850000023841858,1,1,40254,0,0
2023-05-12 00:00:00+02:00,0.949999988079071,1.10000002384186,0.949999988079071,1.01999998092651,1.01999998092651,35026,0,0
2023-05-15 00:00:00+02:00,0.949999988079071,1.01999998092651,0.860000014305115,0.939999997615814,0.939999997615814,41486,0,0
2023-05-16 00:00:00+02:00,0.899999976158142,0.944000005722046,0.800000011920929,0.800000011920929,0.800000011920929,43583,0,0
2023-05-17 00:00:00+02:00,0.850000023841858,0.850000023841858,0.779999971389771,0.810000002384186,0.810000002384186,29984,0,0
2023-05-18 00:00:00+02:00,0.779999971389771,0.78600001335144,0.740000009536743,0.740000009536743,0.740000009536743,24679,0,0
2023-05-19 00:00:00+02:00,0.78600001335144,0.78600001335144,0.649999976158142,0.65200001001358,0.65200001001358,26732,0,0
2023-05-22 00:00:00+02:00,0.8299999833107,1.05999994277954,0.709999978542328,0.709999978542328,0.709999978542328,169538,0,0
2023-05-23 00:00:00+02:00,0.899999976158142,1.60800004005432,0.860000014305115,1.22000002861023,1.22000002861023,858471,0,0
2023-05-24 00:00:00+02:00,1.19400000572205,1.25999999046326,0.779999971389771,0.779999971389771,0.779999971389771,627823,0,0
2023-05-25 00:00:00+02:00,0.980000019073486,1.22000002861023,0.702000021934509,0.732999980449677,0.732999980449677,1068939,0,0
2023-05-26 00:00:00+02:00,0.660000026226044,0.72000002861023,0.602999985218048,0.611999988555908,0.611999988555908,631580,0,0
2023-05-29 00:00:00+02:00,0.620000004768372,0.75,0.578999996185303,0.600000023841858,0.600000023841858,586150,0,0
2023-05-30 00:00:00+02:00,0.610000014305115,0.634999990463257,0.497000008821487,0.497000008821487,0.497000008821487,552308,0,0
2023-05-31 00:00:00+02:00,0.458999991416931,0.469999998807907,0.374000012874603,0.379999995231628,0.379999995231628,899067,0,0

@@ -0,0 +1,30 @@
Date,Open,High,Low,Close,Adj Close,Volume,Dividends,Stock Splits
2023-04-20 00:00:00+02:00,3.0,3.0,2.0,3.0,3.0,2076,0.0,0.0
2023-04-21 00:00:00+02:00,3.0,3.0,2.0,3.0,3.0,2136,0.0,0.0
2023-04-24 00:00:00+02:00,3.0,3.0,1.0,1.0,1.0,77147,0.0,0.0
2023-04-25 00:00:00+02:00,1.0,2.0,1.0,2.0,2.0,9625,0.0,0.0
2023-04-26 00:00:00+02:00,2.0,2.0,1.0,2.0,2.0,5028,0.0,0.0
2023-04-27 00:00:00+02:00,2.0,2.0,1.0,1.0,1.0,3235,0.0,0.0
2023-04-28 00:00:00+02:00,2.0,2.0,1.0,2.0,2.0,10944,0.0,0.0
2023-05-02 00:00:00+02:00,2.0,2.0,2.0,2.0,2.0,12220,0.0,0.0
2023-05-03 00:00:00+02:00,2.0,2.0,2.0,2.0,2.0,4683,0.0,0.0
2023-05-04 00:00:00+02:00,2.0,2.0,1.0,2.0,2.0,3368,0.0,0.0
2023-05-05 00:00:00+02:00,2.0,2.0,1.0,2.0,2.0,26069,0.0,0.0
2023-05-08 00:00:00+02:00,9.999999747378752e-05,0.00019999999494757503,9.999999747378752e-05,9.999999747378752e-05,9.999999747378752e-05,705399568,0.0,0.0
2023-05-09 00:00:00+02:00,1.0,2.0,1.0,1.0,1.0,14228,0.0,0.0
2023-05-10 00:00:00+02:00,1.0800000429153442,1.399999976158142,0.8799999952316284,1.0,1.0,81012,0.0,0.0001
2023-05-11 00:00:00+02:00,1.0399999618530273,1.0399999618530273,0.8500000238418579,1.0,1.0,40254,0.0,0.0
2023-05-12 00:00:00+02:00,0.949999988079071,1.100000023841858,0.949999988079071,1.0199999809265137,1.0199999809265137,35026,0.0,0.0
2023-05-15 00:00:00+02:00,0.949999988079071,1.0199999809265137,0.8600000143051147,0.9399999976158142,0.9399999976158142,41486,0.0,0.0
2023-05-16 00:00:00+02:00,0.8999999761581421,0.9440000057220459,0.800000011920929,0.800000011920929,0.800000011920929,43583,0.0,0.0
2023-05-17 00:00:00+02:00,0.8500000238418579,0.8500000238418579,0.7799999713897705,0.8100000023841858,0.8100000023841858,29984,0.0,0.0
2023-05-18 00:00:00+02:00,0.7799999713897705,0.7860000133514404,0.7400000095367432,0.7400000095367432,0.7400000095367432,24679,0.0,0.0
2023-05-19 00:00:00+02:00,0.7860000133514404,0.7860000133514404,0.6499999761581421,0.6520000100135803,0.6520000100135803,26732,0.0,0.0
2023-05-22 00:00:00+02:00,0.8299999833106995,1.059999942779541,0.7099999785423279,0.7099999785423279,0.7099999785423279,169538,0.0,0.0
2023-05-23 00:00:00+02:00,0.8999999761581421,1.6080000400543213,0.8600000143051147,1.2200000286102295,1.2200000286102295,858471,0.0,0.0
2023-05-24 00:00:00+02:00,1.194000005722046,1.2599999904632568,0.7799999713897705,0.7799999713897705,0.7799999713897705,627823,0.0,0.0
2023-05-25 00:00:00+02:00,0.9800000190734863,1.2200000286102295,0.7020000219345093,0.7329999804496765,0.7329999804496765,1068939,0.0,0.0
2023-05-26 00:00:00+02:00,0.6600000262260437,0.7200000286102295,0.6029999852180481,0.6119999885559082,0.6119999885559082,631580,0.0,0.0
2023-05-29 00:00:00+02:00,0.6200000047683716,0.75,0.5789999961853027,0.6000000238418579,0.6000000238418579,586150,0.0,0.0
2023-05-30 00:00:00+02:00,0.6100000143051147,0.6349999904632568,0.4970000088214874,0.4970000088214874,0.4970000088214874,552308,0.0,0.0
2023-05-31 00:00:00+02:00,0.45899999141693115,0.4699999988079071,0.37400001287460327,0.3799999952316284,0.3799999952316284,899067,0.0,0.0

@@ -0,0 +1,11 @@
Date,Open,High,Low,Close,Adj Close,Volume,Dividends,Stock Splits
2023-05-18 00:00:00+01:00,193.220001220703,200.839996337891,193.220001220703,196.839996337891,196.839996337891,653125,0,0
2023-05-17 00:00:00+01:00,199.740005493164,207.738006591797,190.121994018555,197.860000610352,197.860000610352,822268,0,0
2023-05-16 00:00:00+01:00,215.600006103516,215.600006103516,201.149993896484,205.100006103516,205.100006103516,451009,243.93939,0.471428571428571
2023-05-15 00:00:00+01:00,215.399955531529,219.19995640346,210.599967302595,217.399987792969,102.39998147147,1761679.3939394,0,0
2023-05-12 00:00:00+01:00,214.599988664899,216.199965558733,209.599965558733,211.399977329799,99.573855808803,1522298.48484849,0,0
2023-05-11 00:00:00+01:00,219.999966430664,219.999966430664,212.199987357003,215.000000871931,101.269541277204,3568042.12121213,0,0
2023-05-10 00:00:00+01:00,218.199954659598,223.000000435965,212.59995640346,215.399955531529,101.457929992676,5599908.78787879,0,0
2023-05-09 00:00:00+01:00,224,227.688003540039,218.199996948242,218.399993896484,102.87100982666,1906090,0,0
2023-05-05 00:00:00+01:00,220.999968174526,225.19996686663,220.799976457868,224.4,105.697140066964,964523.636363637,0,0
2023-05-04 00:00:00+01:00,216.999989972796,222.799965558733,216.881988961356,221.399965994698,104.284055655343,880983.93939394,0,0

@@ -0,0 +1,11 @@
Date,Open,High,Low,Close,Adj Close,Volume,Dividends,Stock Splits
2023-05-18 00:00:00+01:00,193.220001220703,200.839996337891,193.220001220703,196.839996337891,196.839996337891,653125,0,0
2023-05-17 00:00:00+01:00,199.740005493164,207.738006591797,190.121994018555,197.860000610352,197.860000610352,822268,0,0
2023-05-16 00:00:00+01:00,215.600006103516,215.600006103516,201.149993896484,205.100006103516,205.100006103516,451009,243.93939,0.471428571428571
2023-05-15 00:00:00+01:00,456.908996582031,464.969604492188,446.727203369141,461.151489257813,217.21208190918,830506,0,0
2023-05-12 00:00:00+01:00,455.212097167969,458.605987548828,444.605987548828,448.424194335938,211.217269897461,717655,0,0
2023-05-11 00:00:00+01:00,466.666595458984,466.666595458984,450.121185302734,456.060607910156,214.814178466797,1682077,0,0
2023-05-10 00:00:00+01:00,462.848388671875,473.030303955078,450.969604492188,456.908996582031,215.213790893555,2639957,0,0
2023-05-09 00:00:00+01:00,224,227.688003540039,218.199996948242,218.399993896484,102.87100982666,1906090,0,0
2023-05-05 00:00:00+01:00,468.787811279297,477.696899414063,468.363586425781,476,224.2060546875,454704,0,0
2023-05-04 00:00:00+01:00,460.303009033203,472.605987548828,460.052703857422,469.636291503906,221.208602905273,415321,0,0

@@ -0,0 +1,24 @@
Date,Open,High,Low,Close,Adj Close,Volume,Dividends,Stock Splits
2023-05-31 00:00:00+10:00,0.120290003716946,0.120290003716946,0.120290003716946,0.120290003716946,0.120290003716946,0,0,0
2023-05-30 00:00:00+10:00,0.120290003716946,0.120290003716946,0.120290003716946,0.120290003716946,0.120290003716946,0,0,0.4406
2023-05-29 00:00:00+10:00,0.120290003716946,0.120290003716946,0.120290003716946,0.120290003716946,0.120290003716946,0,0,0
2023-05-26 00:00:00+10:00,0.120290003716946,0.120290003716946,0.120290003716946,0.120290003716946,0.120290003716946,0,0,0
2023-05-25 00:00:00+10:00,0.120290003716946,0.120290003716946,0.120290003716946,0.120290003716946,0.120290003716946,0,0,0
2023-05-24 00:00:00+10:00,0.120290003716946,0.120290003716946,0.120290003716946,0.120290003716946,0.120290003716946,0,0,0
2023-05-23 00:00:00+10:00,0.120290003716946,0.120290003716946,0.120290003716946,0.120290003716946,0.120290003716946,0,0,0
2023-05-22 00:00:00+10:00,0.120290003716946,0.120290003716946,0.120290003716946,0.120290003716946,0.120290003716946,0,0,0
2023-05-19 00:00:00+10:00,0.120290003716946,0.120290003716946,0.120290003716946,0.120290003716946,0.120290003716946,0,0,0
2023-05-18 00:00:00+10:00,0.120290003716946,0.120290003716946,0.120290003716946,0.120290003716946,0.120290003716946,0,0,0
2023-05-17 00:00:00+10:00,0.120290003716946,0.120290003716946,0.120290003716946,0.120290003716946,0.120290003716946,0,0,0
2023-05-16 00:00:00+10:00,0.120290003716946,0.120290003716946,0.120290003716946,0.120290003716946,0.120290003716946,0,0,0
2023-05-15 00:00:00+10:00,0.120290003716946,0.120290003716946,0.120290003716946,0.120290003716946,0.120290003716946,0,0,0
2023-05-12 00:00:00+10:00,0.120290003716946,0.120290003716946,0.120290003716946,0.120290003716946,0.120290003716946,0,0,0
2023-05-11 00:00:00+10:00,0.120290003716946,0.120290003716946,0.120290003716946,0.120290003716946,0.120290003716946,0,0,0
2023-05-10 00:00:00+10:00,0.120290003716946,0.120290003716946,0.120290003716946,0.120290003716946,0.120290003716946,0,0,0
2023-05-09 00:00:00+10:00,0.120290003716946,0.120290003716946,0.120290003716946,0.120290003716946,0.120290003716946,0,0,0
2023-05-08 00:00:00+10:00,0.120290003716946,0.120290003716946,0.120290003716946,0.120290003716946,0.120290003716946,0,0,0
2023-05-05 00:00:00+10:00,0.120290003716946,0.120290003716946,0.120290003716946,0.120290003716946,0.120290003716946,0,0,0
2023-05-04 00:00:00+10:00,0.120290003716946,0.120290003716946,0.120290003716946,0.120290003716946,0.120290003716946,0,0,0
2023-05-03 00:00:00+10:00,0.120290003716946,0.120290003716946,0.120290003716946,0.120290003716946,0.120290003716946,0,0,0
2023-05-02 00:00:00+10:00,0.120290003716946,0.120290003716946,0.120290003716946,0.120290003716946,0.120290003716946,0,0,0
2023-05-01 00:00:00+10:00,0.120290003716946,0.120290003716946,0.120290003716946,0.120290003716946,0.120290003716946,0,0,0

@@ -0,0 +1,24 @@
Date,Open,High,Low,Close,Adj Close,Volume,Dividends,Stock Splits
2023-05-31 00:00:00+10:00,0.120290003716946,0.120290003716946,0.120290003716946,0.120290003716946,0.120290003716946,0,0,0
2023-05-30 00:00:00+10:00,0.120290003716946,0.120290003716946,0.120290003716946,0.120290003716946,0.120290003716946,0,0,0.4406
2023-05-29 00:00:00+10:00,0.120290003716946,0.120290003716946,0.120290003716946,0.120290003716946,0.120290003716946,0,0,0
2023-05-26 00:00:00+10:00,0.120290003716946,0.120290003716946,0.120290003716946,0.120290003716946,0.120290003716946,0,0,0
2023-05-25 00:00:00+10:00,0.120290003716946,0.120290003716946,0.120290003716946,0.120290003716946,0.120290003716946,0,0,0
2023-05-24 00:00:00+10:00,0.120290003716946,0.120290003716946,0.120290003716946,0.120290003716946,0.120290003716946,0,0,0
2023-05-23 00:00:00+10:00,0.120290003716946,0.120290003716946,0.120290003716946,0.120290003716946,0.120290003716946,0,0,0
2023-05-22 00:00:00+10:00,0.120290003716946,0.120290003716946,0.120290003716946,0.120290003716946,0.120290003716946,0,0,0
2023-05-19 00:00:00+10:00,0.0529999993741512,0.0529999993741512,0.0529999993741512,0.0529999993741512,0.0529999993741512,0,0,0
2023-05-18 00:00:00+10:00,0.0529999993741512,0.0529999993741512,0.0529999993741512,0.0529999993741512,0.0529999993741512,0,0,0
2023-05-17 00:00:00+10:00,0.0529999993741512,0.0529999993741512,0.0529999993741512,0.0529999993741512,0.0529999993741512,0,0,0
2023-05-16 00:00:00+10:00,0.0529999993741512,0.0529999993741512,0.0529999993741512,0.0529999993741512,0.0529999993741512,0,0,0
2023-05-15 00:00:00+10:00,0.0529999993741512,0.0529999993741512,0.0529999993741512,0.0529999993741512,0.0529999993741512,0,0,0
2023-05-12 00:00:00+10:00,0.0529999993741512,0.0529999993741512,0.0529999993741512,0.0529999993741512,0.0529999993741512,0,0,0
2023-05-11 00:00:00+10:00,0.0529999993741512,0.0529999993741512,0.0529999993741512,0.0529999993741512,0.0529999993741512,0,0,0
2023-05-10 00:00:00+10:00,0.0529999993741512,0.0529999993741512,0.0529999993741512,0.0529999993741512,0.0529999993741512,0,0,0
2023-05-09 00:00:00+10:00,0.0529999993741512,0.0529999993741512,0.0529999993741512,0.0529999993741512,0.0529999993741512,0,0,0
2023-05-08 00:00:00+10:00,0.0529999993741512,0.0529999993741512,0.0529999993741512,0.0529999993741512,0.0529999993741512,0,0,0
2023-05-05 00:00:00+10:00,0.0529999993741512,0.0529999993741512,0.0529999993741512,0.0529999993741512,0.0529999993741512,0,0,0
2023-05-04 00:00:00+10:00,0.0529999993741512,0.0529999993741512,0.0529999993741512,0.0529999993741512,0.0529999993741512,0,0,0
2023-05-03 00:00:00+10:00,0.0529999993741512,0.0529999993741512,0.0529999993741512,0.0529999993741512,0.0529999993741512,0,0,0
2023-05-02 00:00:00+10:00,0.0529999993741512,0.0529999993741512,0.0529999993741512,0.0529999993741512,0.0529999993741512,0,0,0
2023-05-01 00:00:00+10:00,0.0529999993741512,0.0529999993741512,0.0529999993741512,0.0529999993741512,0.0529999993741512,0,0,0

@@ -0,0 +1,17 @@
Date,Open,High,Low,Close,Adj Close,Volume,Dividends,Stock Splits
2023-05-08 00:00:00+02:00,24.8999996185303,24.9500007629395,24.1000003814697,24.75,24.75,7187,0,0
2023-05-09 00:00:00+02:00,25,25.5,23.1499996185303,24.1499996185303,24.1499996185303,22753,0,0
2023-05-10 00:00:00+02:00,24.1499996185303,24.1499996185303,22,22.9500007629395,22.9500007629395,62727,0,0
2023-05-11 00:00:00+02:00,22.9500007629395,25,22.9500007629395,23.3500003814697,23.3500003814697,19550,0,0
2023-05-12 00:00:00+02:00,23.3500003814697,24,22.1000003814697,23.8500003814697,23.8500003814697,17143,0,0
2023-05-15 00:00:00+02:00,23,25.7999992370605,22.5,23,23,43709,0,0
2023-05-16 00:00:00+02:00,22.75,24.0499992370605,22.5,22.75,22.75,16068,0,0
2023-05-17 00:00:00+02:00,23,23.8500003814697,22.1000003814697,23.6499996185303,23.6499996185303,19926,0,0
2023-05-19 00:00:00+02:00,23.6499996185303,23.8500003814697,22.1000003814697,22.2999992370605,22.2999992370605,41050,0,0
2023-05-22 00:00:00+02:00,22.0000004768372,24.1499996185303,21.5499997138977,22.7500009536743,22.7500009536743,34022,0,0
2023-05-23 00:00:00+02:00,22.75,22.8999996185303,21.75,22.5,22.5,13992,0,0
2023-05-24 00:00:00+02:00,21,24,21,22.0100002288818,22.0100002288818,18306,0,0.1
2023-05-25 00:00:00+02:00,21.5699996948242,22.8899993896484,20,21.1599998474121,21.1599998474121,35398,0,0
2023-05-26 00:00:00+02:00,21.1599998474121,22.4950008392334,20.5,21.0949993133545,21.0949993133545,8039,0,0
2023-05-29 00:00:00+02:00,22.1000003814697,22.1000003814697,20.25,20.75,20.75,17786,0,0
2023-05-30 00:00:00+02:00,20.75,21.6499996185303,20.1499996185303,20.4500007629395,20.4500007629395,10709,0,0

@@ -0,0 +1,17 @@
Date,Open,High,Low,Close,Adj Close,Volume,Dividends,Stock Splits
2023-05-08 00:00:00+02:00,24.899999618530273,24.950000762939453,24.100000381469727,24.75,24.75,7187,0.0,0.0
2023-05-09 00:00:00+02:00,25.0,25.5,23.149999618530273,24.149999618530273,24.149999618530273,22753,0.0,0.0
2023-05-10 00:00:00+02:00,24.149999618530273,24.149999618530273,22.0,22.950000762939453,22.950000762939453,62727,0.0,0.0
2023-05-11 00:00:00+02:00,22.950000762939453,25.0,22.950000762939453,23.350000381469727,23.350000381469727,19550,0.0,0.0
2023-05-12 00:00:00+02:00,23.350000381469727,24.0,22.100000381469727,23.850000381469727,23.850000381469727,17143,0.0,0.0
2023-05-15 00:00:00+02:00,23.0,25.799999237060547,22.5,23.0,23.0,43709,0.0,0.0
2023-05-16 00:00:00+02:00,22.75,24.049999237060547,22.5,22.75,22.75,16068,0.0,0.0
2023-05-17 00:00:00+02:00,23.0,23.850000381469727,22.100000381469727,23.649999618530273,23.649999618530273,19926,0.0,0.0
2023-05-19 00:00:00+02:00,23.649999618530273,23.850000381469727,22.100000381469727,22.299999237060547,22.299999237060547,41050,0.0,0.0
2023-05-22 00:00:00+02:00,2.200000047683716,2.4149999618530273,2.1549999713897705,2.2750000953674316,2.2750000953674316,340215,0.0,0.0
2023-05-23 00:00:00+02:00,22.75,22.899999618530273,21.75,22.5,22.5,13992,0.0,0.0
2023-05-24 00:00:00+02:00,21.0,24.0,21.0,22.010000228881836,22.010000228881836,18306,0.0,0.1
2023-05-25 00:00:00+02:00,21.56999969482422,22.889999389648438,20.0,21.15999984741211,21.15999984741211,35398,0.0,0.0
2023-05-26 00:00:00+02:00,21.15999984741211,22.4950008392334,20.5,21.094999313354492,21.094999313354492,8039,0.0,0.0
2023-05-29 00:00:00+02:00,22.100000381469727,22.100000381469727,20.25,20.75,20.75,17786,0.0,0.0
2023-05-30 00:00:00+02:00,20.75,21.649999618530273,20.149999618530273,20.450000762939453,20.450000762939453,10709,0.0,0.0

@@ -0,0 +1,23 @@
Date,Open,High,Low,Close,Adj Close,Volume,Dividends,Stock Splits
2022-06-01 00:00:00+02:00,5.72999992370606,5.78199996948242,5.3939998626709,5.3939998626709,5.3939998626709,3095860,0,0
2022-06-02 00:00:00+02:00,5.38600006103516,5.38600006103516,5.26800003051758,5.2939998626709,5.2939998626709,1662880,0,0
2022-06-03 00:00:00+02:00,5.34599990844727,5.34599990844727,5.15800018310547,5.16800003051758,5.16800003051758,1698900,0,0
2022-06-06 00:00:00+02:00,5.16800003051758,5.25200004577637,5.13800010681152,5.18800010681152,5.18800010681152,1074910,0,0
2022-06-07 00:00:00+02:00,5.21800003051758,5.22200012207031,5.07400016784668,5.1560001373291,5.1560001373291,1850680,0,0
2022-06-08 00:00:00+02:00,5.1560001373291,5.17599983215332,5.07200012207031,5.10200004577637,5.10200004577637,1140360,0,0
2022-06-09 00:00:00+02:00,5.09799995422363,5.09799995422363,4.87599983215332,4.8939998626709,4.8939998626709,2025480,0,0
2022-06-10 00:00:00+02:00,4.87999992370606,4.87999992370606,4.50400009155274,4.50400009155274,4.50400009155274,2982730,0,0
2022-06-13 00:00:00+02:00,4.3,4.37599983215332,3.83600006103516,3.83600006103516,3.83600006103516,4568210,0,0.1
2022-06-14 00:00:00+02:00,3.87750015258789,4.15999984741211,3.85200004577637,3.9439998626709,3.9439998626709,5354500,0,0
2022-06-15 00:00:00+02:00,4.03400001525879,4.16450004577637,3.73050003051758,3.73050003051758,3.73050003051758,6662610,0,0
2022-06-16 00:00:00+02:00,3.73050003051758,3.98499984741211,3.72400016784668,3.82550010681152,3.82550010681152,13379960,0,0
2022-06-17 00:00:00+02:00,3.8,4.29949989318848,3.75,4.29949989318848,4.29949989318848,12844160,0,0
2022-06-20 00:00:00+02:00,2.19422197341919,2.2295401096344,2.13992595672607,2.2295401096344,2.2295401096344,12364104,0,0
2022-06-21 00:00:00+02:00,2.24719905853272,2.28515291213989,2.19712090492249,2.21557092666626,2.21557092666626,8434013,0,0
2022-06-22 00:00:00+02:00,1.98679196834564,2.00365996360779,1.73798203468323,1.73798203468323,1.73798203468323,26496542,0,0
2022-06-23 00:00:00+02:00,1.62411904335022,1.68526804447174,1.37320005893707,1.59776198863983,1.59776198863983,48720201,0,0
2022-06-24 00:00:00+02:00,1.47599303722382,1.54610300064087,1.1739410161972,1.24932205677032,1.24932205677032,56877192,0,0
2022-06-27 00:00:00+02:00,1.49899995326996,1.79849994182587,1.49899995326996,1.79849994182587,1.79849994182587,460673,0,0
2022-06-28 00:00:00+02:00,2.15799999237061,3.05100011825562,2.12599992752075,3.05100011825562,3.05100011825562,3058635,0,0
2022-06-29 00:00:00+02:00,2.90000009536743,3.73799991607666,2.85899996757507,3.26399993896484,3.26399993896484,6516761,0,0
2022-06-30 00:00:00+02:00,3.24900007247925,3.28099989891052,2.5,2.5550000667572,2.5550000667572,4805984,0,0

View File

@ -0,0 +1,23 @@
Date,Open,High,Low,Close,Adj Close,Volume,Dividends,Stock Splits
2022-06-01 00:00:00+02:00,57.29999923706055,57.81999969482422,53.939998626708984,53.939998626708984,53.939998626708984,309586,0.0,0.0
2022-06-02 00:00:00+02:00,53.86000061035156,53.86000061035156,52.68000030517578,52.939998626708984,52.939998626708984,166288,0.0,0.0
2022-06-03 00:00:00+02:00,53.459999084472656,53.459999084472656,51.58000183105469,51.68000030517578,51.68000030517578,169890,0.0,0.0
2022-06-06 00:00:00+02:00,51.68000030517578,52.52000045776367,51.380001068115234,51.880001068115234,51.880001068115234,107491,0.0,0.0
2022-06-07 00:00:00+02:00,52.18000030517578,52.220001220703125,50.7400016784668,51.560001373291016,51.560001373291016,185068,0.0,0.0
2022-06-08 00:00:00+02:00,51.560001373291016,51.7599983215332,50.720001220703125,51.02000045776367,51.02000045776367,114036,0.0,0.0
2022-06-09 00:00:00+02:00,50.97999954223633,50.97999954223633,48.7599983215332,48.939998626708984,48.939998626708984,202548,0.0,0.0
2022-06-10 00:00:00+02:00,48.79999923706055,48.79999923706055,45.040000915527344,45.040000915527344,45.040000915527344,298273,0.0,0.0
2022-06-13 00:00:00+02:00,43.0,43.7599983215332,38.36000061035156,38.36000061035156,38.36000061035156,456821,0.0,0.1
2022-06-14 00:00:00+02:00,38.775001525878906,41.599998474121094,38.52000045776367,39.439998626708984,39.439998626708984,535450,0.0,0.0
2022-06-15 00:00:00+02:00,40.34000015258789,41.64500045776367,37.30500030517578,37.30500030517578,37.30500030517578,666261,0.0,0.0
2022-06-16 00:00:00+02:00,37.30500030517578,39.849998474121094,37.2400016784668,38.255001068115234,38.255001068115234,1337996,0.0,0.0
2022-06-17 00:00:00+02:00,38.0,42.994998931884766,37.5,42.994998931884766,42.994998931884766,1284416,0.0,0.0
2022-06-20 00:00:00+02:00,2.1942219734191895,2.2295401096343994,2.139925956726074,2.2295401096343994,2.2295401096343994,12364104,0.0,0.0
2022-06-21 00:00:00+02:00,2.247199058532715,2.2851529121398926,2.1971209049224854,2.2155709266662598,2.2155709266662598,8434013,0.0,0.0
2022-06-22 00:00:00+02:00,1.986791968345642,2.003659963607788,1.7379820346832275,1.7379820346832275,1.7379820346832275,26496542,0.0,0.0
2022-06-23 00:00:00+02:00,1.6241190433502197,1.6852680444717407,1.3732000589370728,1.5977619886398315,1.5977619886398315,48720201,0.0,0.0
2022-06-24 00:00:00+02:00,1.475993037223816,1.5461030006408691,1.1739410161972046,1.2493220567703247,1.2493220567703247,56877192,0.0,0.0
2022-06-27 00:00:00+02:00,1.4989999532699585,1.7984999418258667,1.4989999532699585,1.7984999418258667,1.7984999418258667,460673,0.0,0.0
2022-06-28 00:00:00+02:00,2.1579999923706055,3.0510001182556152,2.125999927520752,3.0510001182556152,3.0510001182556152,3058635,0.0,0.0
2022-06-29 00:00:00+02:00,2.9000000953674316,3.73799991607666,2.8589999675750732,3.2639999389648438,3.2639999389648438,6516761,0.0,0.0
2022-06-30 00:00:00+02:00,3.249000072479248,3.2809998989105225,2.5,2.555000066757202,2.555000066757202,4805984,0.0,0.0

View File

@ -0,0 +1,30 @@
Date,Open,High,Low,Close,Adj Close,Volume,Dividends,Stock Splits
2023-06-09 00:00:00+02:00,34.700001,34.709999,33.240002,33.619999,33.619999,7148409,0,0
2023-06-08 00:00:00+02:00,34.900002,34.990002,34.040001,34.360001,34.360001,10406999,0,0
2023-06-07 00:00:00+02:00,34.549999,35.639999,34.320000,35.090000,35.090000,10118918,0,0
2023-06-06 00:00:00+02:00,34.500000,34.820000,34.049999,34.459999,34.459999,9109709,0,0
2023-06-05 00:00:00+02:00,35.000000,35.299999,34.200001,34.700001,34.700001,8791993,0,0
2023-06-02 00:00:00+02:00,35.689999,36.180000,34.599998,34.970001,34.970001,8844549,0,0
2023-06-01 00:00:00+02:00,35.230000,35.380001,34.240002,35.349998,35.349998,6721030,0,0
2023-05-31 00:00:00+02:00,3480,3548,3426,3501,3501,32605833,0,0
2023-05-30 00:00:00+02:00,3439,3537,3385,3423,3423,8970804,0,0
2023-05-29 00:00:00+02:00,3466,3506,3402,3432,3432,3912803,0,0
2023-05-26 00:00:00+02:00,3475,3599,3433,3453,3453,6744718,0,0
2023-05-25 00:00:00+02:00,3540,3609,3463,3507,3507,16900221,0,0
2023-05-24 00:00:00+02:00,3620,3650,3526,3540,3540,9049505,0,0
2023-05-23 00:00:00+02:00,3690,3667,3556,3610,3610,10797373,0,0
2023-05-22 00:00:00+02:00,3705,3736,3609,3661,3661,7132641,0,0
2023-05-19 00:00:00+02:00,3620,3715,3625,3690,3690,12648518,0,0
2023-05-18 00:00:00+02:00,3657,3699,3584,3646,3646,10674542,0,0
2023-05-17 00:00:00+02:00,3687,3731,3656,3671,3671,9892791,0,0
2023-05-16 00:00:00+02:00,3715,3773,3696,3703,3703,4706789,0,0
2023-05-15 00:00:00+02:00,3774,3805,3696,3727,3727,7890969,0,0
2023-05-12 00:00:00+02:00,3750,3844,3671,3774,3774,8724303,0,0
2023-05-11 00:00:00+02:00,3880,3888,3701,3732,3732,14371855,0,0
2023-05-10 00:00:00+02:00,3893,3880,3642,3810,3810,30393389,0,0
2023-05-09 00:00:00+02:00,4441,4441,3939,3966,3966,19833428,0,0
2023-05-08 00:00:00+02:00,4463,4578,4456,4471,4471,11092519,0,0
2023-05-05 00:00:00+02:00,4299,4490,4287,4458,4458,28539048,0,0
2023-05-04 00:00:00+02:00,4149,4330,4123,4283,4283,15506868,0,0
2023-05-03 00:00:00+02:00,3975,4098,3968,4095,4095,14657028,0,0
2023-05-02 00:00:00+02:00,4037,4032,3917,3965,3965,11818133,0,0
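Note: the fixture above shows a block unit switch: closes around 33-35 for June 2023 but roughly 3400-4400 (cents) for May 2023 and earlier. A minimal sketch, assuming pandas, of locating such a switch from the ratio of consecutive closes (illustrative only, not yfinance's `_fix_unit_switch` internals):

import pandas as pd

def find_unit_switch(close: pd.Series, threshold: float = 20.0):
    # Return the first date whose price scale jumps by roughly 100x either way.
    close = close.sort_index()
    ratio = close / close.shift(1)
    jumps = ratio[(ratio > threshold) | (ratio < 1.0 / threshold)]
    return None if jumps.empty else jumps.index[0]

On this fixture's Close column it returns 2023-06-01, the first date quoted in whole currency rather than cents.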

View File

@ -1,21 +1,19 @@
from .context import yfinance as yf
from .context import session_gbl
import unittest
import os
import datetime as _dt
import pytz as _tz
import numpy as _np
import pandas as _pd
import requests_cache
class TestPriceHistory(unittest.TestCase):
session = None
@classmethod
def setUpClass(cls):
cls.session = requests_cache.CachedSession(backend='memory')
cls.session = session_gbl
@classmethod
def tearDownClass(cls):
@ -34,11 +32,23 @@ class TestPriceHistory(unittest.TestCase):
f = df.index.time == _dt.time(0)
self.assertTrue(f.all())
def test_download(self):
tkrs = ["BHP.AX", "IMP.JO", "BP.L", "PNL.L", "INTC"]
intervals = ["1d", "1wk", "1mo"]
for interval in intervals:
df = yf.download(tkrs, period="5y", interval=interval)
f = df.index.time == _dt.time(0)
self.assertTrue(f.all())
df_tkrs = df.columns.levels[1]
self.assertEqual(sorted(tkrs), sorted(df_tkrs))
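Note: with multiple tickers, yf.download returns a DataFrame whose columns are a two-level MultiIndex (price field, then ticker), which is what df.columns.levels[1] reads above. A minimal sketch:

import yfinance as yf

df = yf.download(["MSFT", "AAPL"], period="5d", interval="1d")
close_msft = df[("Close", "MSFT")]  # level 0 = price field, level 1 = ticker
tickers = df.columns.levels[1]      # every ticker present in the result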
def test_duplicatingHourly(self):
tkrs = ["IMP.JO", "BHG.JO", "SSW.JO", "BP.L", "INTC"]
for tkr in tkrs:
dat = yf.Ticker(tkr, session=self.session)
tz = dat._get_ticker_tz(debug_mode=False, proxy=None, timeout=None)
tz = dat._get_ticker_tz(proxy=None, timeout=None)
dt_utc = _tz.timezone("UTC").localize(_dt.datetime.utcnow())
dt = dt_utc.astimezone(_tz.timezone(tz))
@ -58,7 +68,7 @@ class TestPriceHistory(unittest.TestCase):
test_run = False
for tkr in tkrs:
dat = yf.Ticker(tkr, session=self.session)
tz = dat._get_ticker_tz(debug_mode=False, proxy=None, timeout=None)
tz = dat._get_ticker_tz(proxy=None, timeout=None)
dt_utc = _tz.timezone("UTC").localize(_dt.datetime.utcnow())
dt = dt_utc.astimezone(_tz.timezone(tz))
@ -84,7 +94,7 @@ class TestPriceHistory(unittest.TestCase):
test_run = False
for tkr in tkrs:
dat = yf.Ticker(tkr, session=self.session)
tz = dat._get_ticker_tz(debug_mode=False, proxy=None, timeout=None)
tz = dat._get_ticker_tz(proxy=None, timeout=None)
dt = _tz.timezone(tz).localize(_dt.datetime.now())
if dt.date().weekday() not in [1, 2, 3, 4]:
@ -105,6 +115,32 @@ class TestPriceHistory(unittest.TestCase):
self.skipTest("Skipping test_duplicatingWeekly() because not possible to fail Monday/weekend")
def test_intraDayWithEvents(self):
tkrs = ["BHP.AX", "IMP.JO", "BP.L", "PNL.L", "INTC"]
test_run = False
for tkr in tkrs:
start_d = _dt.date.today() - _dt.timedelta(days=59)
end_d = None
df_daily = yf.Ticker(tkr, session=self.session).history(start=start_d, end=end_d, interval="1d", actions=True)
df_daily_divs = df_daily["Dividends"][df_daily["Dividends"] != 0]
if df_daily_divs.shape[0] == 0:
continue
last_div_date = df_daily_divs.index[-1]
start_d = last_div_date.date()
end_d = last_div_date.date() + _dt.timedelta(days=1)
df_intraday = yf.Ticker(tkr, session=self.session).history(start=start_d, end=end_d, interval="15m", actions=True)
self.assertTrue((df_intraday["Dividends"] != 0.0).any())
df_intraday_divs = df_intraday["Dividends"][df_intraday["Dividends"] != 0]
df_intraday_divs.index = df_intraday_divs.index.floor('D')
self.assertTrue(df_daily_divs.equals(df_intraday_divs))
test_run = True
if not test_run:
self.skipTest("Skipping test_intraDayWithEvents() because no tickers had a dividend in last 60 days")
def test_intraDayWithEvents_tase(self):
# TASE releases dividends pre-market; that doesn't merge nicely with intra-day data, so check the dividend is still present
tase_tkrs = ["ICL.TA", "ESLT.TA", "ONE.TA", "MGDL.TA"]
@ -115,21 +151,45 @@ class TestPriceHistory(unittest.TestCase):
df_daily = yf.Ticker(tkr, session=self.session).history(start=start_d, end=end_d, interval="1d", actions=True)
df_daily_divs = df_daily["Dividends"][df_daily["Dividends"] != 0]
if df_daily_divs.shape[0] == 0:
# self.skipTest("Skipping test_intraDayWithEvents() because 'ICL.TA' has no dividend in last 60 days")
continue
last_div_date = df_daily_divs.index[-1]
start_d = last_div_date.date()
end_d = last_div_date.date() + _dt.timedelta(days=1)
df = yf.Ticker(tkr, session=self.session).history(start=start_d, end=end_d, interval="15m", actions=True)
self.assertTrue((df["Dividends"] != 0.0).any())
df_intraday = yf.Ticker(tkr, session=self.session).history(start=start_d, end=end_d, interval="15m", actions=True)
self.assertTrue((df_intraday["Dividends"] != 0.0).any())
df_intraday_divs = df_intraday["Dividends"][df_intraday["Dividends"] != 0]
df_intraday_divs.index = df_intraday_divs.index.floor('D')
self.assertTrue(df_daily_divs.equals(df_intraday_divs))
test_run = True
break
if not test_run:
self.skipTest("Skipping test_intraDayWithEvents() because no tickers had a dividend in last 60 days")
self.skipTest("Skipping test_intraDayWithEvents_tase() because no tickers had a dividend in last 60 days")
def test_dailyWithEvents(self):
start_d = _dt.date(2022, 1, 1)
end_d = _dt.date(2023, 1, 1)
tkr_div_dates = {}
tkr_div_dates['BHP.AX'] = [_dt.date(2022, 9, 1), _dt.date(2022, 2, 24)] # Yahoo claims 23-Feb but that's wrong because of DST
tkr_div_dates['IMP.JO'] = [_dt.date(2022, 9, 21), _dt.date(2022, 3, 16)]
tkr_div_dates['BP.L'] = [_dt.date(2022, 11, 10), _dt.date(2022, 8, 11), _dt.date(2022, 5, 12), _dt.date(2022, 2, 17)]
tkr_div_dates['INTC'] = [_dt.date(2022, 11, 4), _dt.date(2022, 8, 4), _dt.date(2022, 5, 5), _dt.date(2022, 2, 4)]
for tkr,dates in tkr_div_dates.items():
df = yf.Ticker(tkr, session=self.session).history(interval='1d', start=start_d, end=end_d)
df_divs = df[df['Dividends']!=0].sort_index(ascending=False)
try:
self.assertTrue((df_divs.index.date == dates).all())
except:
print(f'- ticker = {tkr}')
print('- response:') ; print(df_divs.index.date)
print('- answer:') ; print(dates)
raise
def test_dailyWithEvents_bugs(self):
# Reproduce issue #521
tkr1 = "QQQ"
tkr2 = "GDX"
@ -163,6 +223,60 @@ class TestPriceHistory(unittest.TestCase):
print("{}-without-events missing these dates: {}".format(tkr, missing_from_df2))
raise
def test_intraDayWithEvents(self):
tkrs = ["BHP.AX", "IMP.JO", "BP.L", "PNL.L", "INTC"]
test_run = False
for tkr in tkrs:
start_d = _dt.date.today() - _dt.timedelta(days=59)
end_d = None
df_daily = yf.Ticker(tkr, session=self.session).history(start=start_d, end=end_d, interval="1d", actions=True)
df_daily_divs = df_daily["Dividends"][df_daily["Dividends"] != 0]
if df_daily_divs.shape[0] == 0:
continue
last_div_date = df_daily_divs.index[-1]
start_d = last_div_date.date()
end_d = last_div_date.date() + _dt.timedelta(days=1)
df_intraday = yf.Ticker(tkr, session=self.session).history(start=start_d, end=end_d, interval="15m", actions=True)
self.assertTrue((df_intraday["Dividends"] != 0.0).any())
df_intraday_divs = df_intraday["Dividends"][df_intraday["Dividends"] != 0]
df_intraday_divs.index = df_intraday_divs.index.floor('D')
self.assertTrue(df_daily_divs.equals(df_intraday_divs))
test_run = True
if not test_run:
self.skipTest("Skipping test_intraDayWithEvents() because no tickers had a dividend in last 60 days")
def test_intraDayWithEvents_tase(self):
# TASE releases dividends pre-market; that doesn't merge nicely with intra-day data, so check the dividend is still present
tase_tkrs = ["ICL.TA", "ESLT.TA", "ONE.TA", "MGDL.TA"]
test_run = False
for tkr in tase_tkrs:
start_d = _dt.date.today() - _dt.timedelta(days=59)
end_d = None
df_daily = yf.Ticker(tkr, session=self.session).history(start=start_d, end=end_d, interval="1d", actions=True)
df_daily_divs = df_daily["Dividends"][df_daily["Dividends"] != 0]
if df_daily_divs.shape[0] == 0:
continue
last_div_date = df_daily_divs.index[-1]
start_d = last_div_date.date()
end_d = last_div_date.date() + _dt.timedelta(days=1)
df_intraday = yf.Ticker(tkr, session=self.session).history(start=start_d, end=end_d, interval="15m", actions=True)
self.assertTrue((df_intraday["Dividends"] != 0.0).any())
df_intraday_divs = df_intraday["Dividends"][df_intraday["Dividends"] != 0]
df_intraday_divs.index = df_intraday_divs.index.floor('D')
self.assertTrue(df_daily_divs.equals(df_intraday_divs))
test_run = True
if not test_run:
self.skipTest("Skipping test_intraDayWithEvents_tase() because no tickers had a dividend in last 60 days")
def test_weeklyWithEvents(self):
# Reproduce issue #521
tkr1 = "QQQ"
@ -230,6 +344,22 @@ class TestPriceHistory(unittest.TestCase):
print("{}-without-events missing these dates: {}".format(tkr, missing_from_df2))
raise
def test_monthlyWithEvents2(self):
# Simply check no exception from internal merge
dfm = yf.Ticker("ABBV").history(period="max", interval="1mo")
dfd = yf.Ticker("ABBV").history(period="max", interval="1d")
dfd = dfd[dfd.index > dfm.index[0]]
dfm_divs = dfm[dfm['Dividends']!=0]
dfd_divs = dfd[dfd['Dividends']!=0]
self.assertEqual(dfm_divs.shape[0], dfd_divs.shape[0])
dfm = yf.Ticker("F").history(period="50mo",interval="1mo")
dfd = yf.Ticker("F").history(period="50mo", interval="1d")
dfd = dfd[dfd.index > dfm.index[0]]
dfm_divs = dfm[dfm['Dividends']!=0]
dfd_divs = dfd[dfd['Dividends']!=0]
self.assertEqual(dfm_divs.shape[0], dfd_divs.shape[0])
def test_tz_dst_ambiguous(self):
# Reproduce issue #1100
try:
@ -381,12 +511,22 @@ class TestPriceHistory(unittest.TestCase):
df = dat.history(start=start, interval="1wk")
self.assertTrue((df.index.weekday == 0).all())
def test_aggregate_capital_gains(self):
# Setup
tkr = "FXAIX"
dat = yf.Ticker(tkr, session=self.session)
start = "2017-12-31"
end = "2019-12-31"
interval = "3mo"
df = dat.history(start=start, end=end, interval=interval)
class TestPriceRepair(unittest.TestCase):
session = None
@classmethod
def setUpClass(cls):
cls.session = requests_cache.CachedSession(backend='memory')
cls.session = session_gbl
@classmethod
def tearDownClass(cls):
@ -413,7 +553,7 @@ class TestPriceRepair(unittest.TestCase):
start_dt = end_dt - td_60d
df = dat.history(start=start_dt, end=end_dt, interval="2m", repair=True)
def test_repair_100x_weekly(self):
def test_repair_100x_random_weekly(self):
# Setup:
tkr = "PNL.L"
dat = yf.Ticker(tkr, session=self.session)
@ -441,7 +581,7 @@ class TestPriceRepair(unittest.TestCase):
# Run test
df_repaired = dat._fix_unit_mixups(df_bad, "1wk", tz_exchange, prepost=False)
df_repaired = dat._fix_unit_random_mixups(df_bad, "1wk", tz_exchange, prepost=False, silent=True)
# First test - no errors left
for c in data_cols:
@ -464,7 +604,10 @@ class TestPriceRepair(unittest.TestCase):
f_1 = ratio == 1
self.assertTrue((f_100 | f_1).all())
def test_repair_100x_weekly_preSplit(self):
self.assertTrue("Repaired?" in df_repaired.columns)
self.assertFalse(df_repaired["Repaired?"].isna().any())
def test_repair_100x_random_weekly_preSplit(self):
# PNL.L has a stock-split in 2022. Sometimes requesting data before 2022 is not split-adjusted.
tkr = "PNL.L"
@ -496,7 +639,7 @@ class TestPriceRepair(unittest.TestCase):
df.index = df.index.tz_localize(tz_exchange)
df_bad.index = df_bad.index.tz_localize(tz_exchange)
df_repaired = dat._fix_unit_mixups(df_bad, "1wk", tz_exchange, prepost=False)
df_repaired = dat._fix_unit_random_mixups(df_bad, "1wk", tz_exchange, prepost=False, silent=True)
# First test - no errors left
for c in data_cols:
@ -521,7 +664,10 @@ class TestPriceRepair(unittest.TestCase):
f_1 = ratio == 1
self.assertTrue((f_100 | f_1).all())
def test_repair_100x_daily(self):
self.assertTrue("Repaired?" in df_repaired.columns)
self.assertFalse(df_repaired["Repaired?"].isna().any())
def test_repair_100x_random_daily(self):
tkr = "PNL.L"
dat = yf.Ticker(tkr, session=self.session)
tz_exchange = dat.fast_info["timezone"]
@ -546,7 +692,7 @@ class TestPriceRepair(unittest.TestCase):
df.index = df.index.tz_localize(tz_exchange)
df_bad.index = df_bad.index.tz_localize(tz_exchange)
df_repaired = dat._fix_unit_mixups(df_bad, "1d", tz_exchange, prepost=False)
df_repaired = dat._fix_unit_random_mixups(df_bad, "1d", tz_exchange, prepost=False, silent=True)
# First test - no errors left
for c in data_cols:
@ -563,6 +709,54 @@ class TestPriceRepair(unittest.TestCase):
f_1 = ratio == 1
self.assertTrue((f_100 | f_1).all())
self.assertTrue("Repaired?" in df_repaired.columns)
self.assertFalse(df_repaired["Repaired?"].isna().any())
def test_repair_100x_block_daily(self):
# Some 100x errors are not sporadic.
# Sometimes Yahoo suddenly switches from cents to $ from some recent date onwards.
tkr = "SSW.JO"
dat = yf.Ticker(tkr, session=self.session)
tz_exchange = dat.fast_info["timezone"]
data_cols = ["Low", "High", "Open", "Close", "Adj Close"]
_dp = os.path.dirname(__file__)
df_bad = _pd.read_csv(os.path.join(_dp, "data", tkr.replace('.','-')+"-100x-error.csv"), index_col="Date")
df_bad.index = _pd.to_datetime(df_bad.index)
df_bad = df_bad.sort_index()
df = df_bad.copy()
for d in data_cols:
df.loc[:'2023-05-31', d] *= 0.01 # fix error
df_repaired = dat._fix_unit_switch(df_bad, "1d", tz_exchange)
df_repaired = df_repaired.sort_index()
# First test - no errors left
for c in data_cols:
try:
self.assertTrue(_np.isclose(df_repaired[c], df[c], rtol=1e-2).all())
except:
print(df_repaired[c])
print(df[c])
print(f"TEST FAIL on column '{c}")
raise
# Second test - all differences should be either ~1x or ~100x
ratio = df_bad[data_cols].values / df[data_cols].values
ratio = ratio.round(2)
# - round near-100 ratio to 100:
f = ratio > 90
ratio[f] = (ratio[f] / 10).round().astype(int) * 10 # round ratio to nearest 10
# - now test
f_100 = ratio == 100
f_1 = ratio == 1
self.assertTrue((f_100 | f_1).all())
self.assertTrue("Repaired?" in df_repaired.columns)
self.assertFalse(df_repaired["Repaired?"].isna().any())
def test_repair_zeroes_daily(self):
tkr = "BBIL.L"
dat = yf.Ticker(tkr, session=self.session)
@ -590,6 +784,45 @@ class TestPriceRepair(unittest.TestCase):
for c in ["Open", "Low", "High", "Close"]:
self.assertTrue(_np.isclose(repaired_df[c], correct_df[c], rtol=1e-8).all())
self.assertTrue("Repaired?" in repaired_df.columns)
self.assertFalse(repaired_df["Repaired?"].isna().any())
def test_repair_zeroes_daily_adjClose(self):
# Test that 'Adj Close' is reconstructed correctly,
# particularly when a dividend occurred within 1 day.
tkr = "INTC"
df = _pd.DataFrame(data={"Open": [28.95, 28.65, 29.55, 29.62, 29.25],
"High": [29.12, 29.27, 29.65, 31.17, 30.30],
"Low": [28.21, 28.43, 28.61, 29.53, 28.80],
"Close": [28.24, 29.05, 28.69, 30.32, 30.19],
"Adj Close": [28.12, 28.93, 28.57, 29.83, 29.70],
"Volume": [36e6, 51e6, 49e6, 58e6, 62e6],
"Dividends": [0, 0, 0.365, 0, 0]},
index=_pd.to_datetime([_dt.datetime(2023, 2, 8),
_dt.datetime(2023, 2, 7),
_dt.datetime(2023, 2, 6),
_dt.datetime(2023, 2, 3),
_dt.datetime(2023, 2, 2)]))
df = df.sort_index()
df.index.name = "Date"
dat = yf.Ticker(tkr, session=self.session)
tz_exchange = dat.fast_info["timezone"]
df.index = df.index.tz_localize(tz_exchange)
rtol = 5e-3
for i in [0, 1, 2]:
df_slice = df.iloc[i:i+3]
for j in range(3):
df_slice_bad = df_slice.copy()
df_slice_bad.loc[df_slice_bad.index[j], "Adj Close"] = 0.0
df_slice_bad_repaired = dat._fix_zeroes(df_slice_bad, "1d", tz_exchange, prepost=False)
for c in ["Close", "Adj Close"]:
self.assertTrue(_np.isclose(df_slice_bad_repaired[c], df_slice[c], rtol=rtol).all())
self.assertTrue("Repaired?" in df_slice_bad_repaired.columns)
self.assertFalse(df_slice_bad_repaired["Repaired?"].isna().any())
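Note: the 'Adj Close' reconstruction being tested follows the standard back-adjustment arithmetic: on each ex-dividend date, all earlier prices are scaled by (previous close - dividend) / previous close, compounded across later dividends. A minimal sketch of that arithmetic (not yfinance's internal repair code; the fixture's absolute values also bake in dividends after this five-day window):

import pandas as pd

def back_adjust(close: pd.Series, dividends: pd.Series) -> pd.Series:
    # Recompute 'Adj Close' from Close plus Dividends within a window.
    factor = pd.Series(1.0, index=close.index)
    for dt in dividends[dividends != 0].index:
        i = close.index.get_loc(dt)
        if i == 0:
            continue  # ex-date at window start: no previous close to scale from
        prev_close = close.iloc[i - 1]
        factor.iloc[:i] *= (prev_close - dividends.loc[dt]) / prev_close  # scale all rows before the ex-date
    return close * factor

For the fixture's 0.365 dividend on 2023-02-06 with a 30.32 close the day before, the factor is (30.32 - 0.365) / 30.32, about 0.988, which matches the fixture's pre/post ratio.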
def test_repair_zeroes_hourly(self):
tkr = "INTC"
dat = yf.Ticker(tkr, session=self.session)
@ -621,13 +854,68 @@ class TestPriceRepair(unittest.TestCase):
print(repaired_df[c] - correct_df[c])
raise
self.assertTrue("Repaired?" in repaired_df.columns)
self.assertFalse(repaired_df["Repaired?"].isna().any())
def test_repair_bad_stock_split(self):
bad_tkrs = ['4063.T', 'ALPHA.PA', 'CNE.L', 'MOB.ST', 'SPM.MI']
for tkr in bad_tkrs:
dat = yf.Ticker(tkr, session=self.session)
tz_exchange = dat.fast_info["timezone"]
_dp = os.path.dirname(__file__)
df_bad = _pd.read_csv(os.path.join(_dp, "data", tkr.replace('.','-')+"-bad-stock-split.csv"), index_col="Date")
df_bad.index = _pd.to_datetime(df_bad.index)
repaired_df = dat._fix_bad_stock_split(df_bad, "1d", tz_exchange)
correct_df = _pd.read_csv(os.path.join(_dp, "data", tkr.replace('.','-')+"-bad-stock-split-fixed.csv"), index_col="Date")
correct_df.index = _pd.to_datetime(correct_df.index)
repaired_df = repaired_df.sort_index()
correct_df = correct_df.sort_index()
for c in ["Open", "Low", "High", "Close", "Adj Close", "Volume"]:
try:
self.assertTrue(_np.isclose(repaired_df[c], correct_df[c], rtol=5e-6).all())
except:
print(f"tkr={tkr} COLUMN={c}")
print("- repaired_df")
print(repaired_df)
print("- correct_df[c]:")
print(correct_df[c])
print("- diff:")
print(repaired_df[c] - correct_df[c])
raise
# Stocks that split in 2022 but no problems in Yahoo data,
# so repair should change nothing
good_tkrs = ['AMZN', 'DXCM', 'FTNT', 'GOOG', 'GME', 'PANW', 'SHOP', 'TSLA']
good_tkrs += ['AEI', 'CHRA', 'GHI', 'IRON', 'LXU', 'NUZE', 'RSLS', 'TISI']
good_tkrs += ['BOL.ST', 'TUI1.DE']
intervals = ['1d', '1wk', '1mo', '3mo']
for tkr in good_tkrs:
for interval in intervals:
dat = yf.Ticker(tkr, session=self.session)
tz_exchange = dat.fast_info["timezone"]
_dp = os.path.dirname(__file__)
df_good = dat.history(period='2y', interval=interval, auto_adjust=False)
repaired_df = dat._fix_bad_stock_split(df_good, interval, tz_exchange)
# Expect no change from repair
df_good = df_good.sort_index()
repaired_df = repaired_df.sort_index()
for c in ["Open", "Low", "High", "Close", "Adj Close", "Volume"]:
try:
self.assertTrue((repaired_df[c].to_numpy() == df_good[c].to_numpy()).all())
except:
print(f"tkr={tkr} interval={interval} COLUMN={c}")
df_dbg = df_good[[c]].join(repaired_df[[c]], lsuffix='.good', rsuffix='.repaired')
f_diff = repaired_df[c].to_numpy() != df_good[c].to_numpy()
print(df_dbg[f_diff | _np.roll(f_diff, 1) | _np.roll(f_diff, -1)])
raise
if __name__ == '__main__':
unittest.main()
# # Run tests sequentially:
# import inspect
# test_src = inspect.getsource(TestPriceHistory)
# unittest.TestLoader.sortTestMethodsUsing = lambda _, x, y: (
# test_src.index(f"def {x}") - test_src.index(f"def {y}")
# )
# unittest.main(verbosity=2)

View File

@ -12,25 +12,18 @@ import pandas as pd
import numpy as np
from .context import yfinance as yf
from .context import session_gbl
import unittest
import requests_cache
# Set this to see the exact requests that are made during tests
DEBUG_LOG_REQUESTS = False
if DEBUG_LOG_REQUESTS:
import logging
logging.basicConfig(level=logging.DEBUG)
class TestTicker(unittest.TestCase):
session = None
@classmethod
def setUpClass(cls):
cls.session = requests_cache.CachedSession(backend='memory')
cls.session = session_gbl
cls.proxy = None
@ -47,19 +40,27 @@ class TestTicker(unittest.TestCase):
# Test:
dat = yf.Ticker(tkr, session=self.session)
tz = dat._get_ticker_tz(proxy=None, timeout=None, debug_mode=False, raise_errors=False)
tz = dat._get_ticker_tz(proxy=None, timeout=None)
self.assertIsNotNone(tz)
def test_badTicker(self):
# Check yfinance doesn't die when ticker delisted
tkr = "AM2Z.TA"
tkr = "DJI" # typo of "^DJI"
dat = yf.Ticker(tkr, session=self.session)
dat.history(period="1wk")
dat.history(start="2022-01-01")
dat.history(start="2022-01-01", end="2022-03-01")
yf.download([tkr], period="1wk")
yf.download([tkr], period="1wk", threads=False, ignore_tz=False)
yf.download([tkr], period="1wk", threads=True, ignore_tz=False)
yf.download([tkr], period="1wk", threads=False, ignore_tz=True)
yf.download([tkr], period="1wk", threads=True, ignore_tz=True)
for k in dat.fast_info:
dat.fast_info[k]
dat.isin
dat.major_holders
dat.institutional_holders
@ -67,69 +68,83 @@ class TestTicker(unittest.TestCase):
dat.dividends
dat.splits
dat.actions
dat.shares
dat.get_shares_full()
dat.info
dat.calendar
dat.recommendations
dat.earnings
dat.quarterly_earnings
dat.options
dat.news
dat.earnings_dates
dat.income_stmt
dat.quarterly_income_stmt
dat.balance_sheet
dat.quarterly_balance_sheet
dat.cashflow
dat.quarterly_cashflow
dat.recommendations_summary
dat.analyst_price_target
dat.revenue_forecasts
dat.sustainability
dat.options
dat.news
dat.earnings_trend
dat.earnings_dates
dat.earnings_forecasts
# These haven't been ported to Yahoo API
# dat.shares
# dat.info
# dat.calendar
# dat.recommendations
# dat.earnings
# dat.quarterly_earnings
# dat.recommendations_summary
# dat.analyst_price_target
# dat.revenue_forecasts
# dat.sustainability
# dat.earnings_trend
# dat.earnings_forecasts
def test_goodTicker(self):
# Check that yfinance works when the full API is called on the same Ticker instance
tkr = "IBM"
dat = yf.Ticker(tkr, session=self.session)
tkrs = ["IBM"]
tkrs.append("QCSTIX") # weird ticker, no price history but has previous close
for tkr in tkrs:
dat = yf.Ticker(tkr, session=self.session)
dat.isin
dat.major_holders
dat.institutional_holders
dat.mutualfund_holders
dat.dividends
dat.splits
dat.actions
dat.shares
dat.get_shares_full()
dat.info
dat.calendar
dat.recommendations
dat.earnings
dat.quarterly_earnings
dat.income_stmt
dat.quarterly_income_stmt
dat.balance_sheet
dat.quarterly_balance_sheet
dat.cashflow
dat.quarterly_cashflow
dat.recommendations_summary
dat.analyst_price_target
dat.revenue_forecasts
dat.sustainability
dat.options
dat.news
dat.earnings_trend
dat.earnings_dates
dat.earnings_forecasts
dat.history(period="1wk")
dat.history(start="2022-01-01")
dat.history(start="2022-01-01", end="2022-03-01")
yf.download([tkr], period="1wk", threads=False, ignore_tz=False)
yf.download([tkr], period="1wk", threads=True, ignore_tz=False)
yf.download([tkr], period="1wk", threads=False, ignore_tz=True)
yf.download([tkr], period="1wk", threads=True, ignore_tz=True)
dat.history(period="1wk")
dat.history(start="2022-01-01")
dat.history(start="2022-01-01", end="2022-03-01")
yf.download([tkr], period="1wk")
for k in dat.fast_info:
dat.fast_info[k]
dat.isin
dat.major_holders
dat.institutional_holders
dat.mutualfund_holders
dat.dividends
dat.splits
dat.actions
dat.get_shares_full()
dat.options
dat.news
dat.earnings_dates
dat.income_stmt
dat.quarterly_income_stmt
dat.balance_sheet
dat.quarterly_balance_sheet
dat.cashflow
dat.quarterly_cashflow
# These require decryption which is broken:
# dat.shares
# dat.info
# dat.calendar
# dat.recommendations
# dat.earnings
# dat.quarterly_earnings
# dat.recommendations_summary
# dat.analyst_price_target
# dat.revenue_forecasts
# dat.sustainability
# dat.earnings_trend
# dat.earnings_forecasts
def test_goodTicker_withProxy(self):
# Check that yfinance works when the full API is called on the same Ticker instance
@ -260,7 +275,7 @@ class TestTickerHistory(unittest.TestCase):
@classmethod
def setUpClass(cls):
cls.session = requests_cache.CachedSession(backend='memory')
cls.session = session_gbl
@classmethod
def tearDownClass(cls):
@ -269,19 +284,28 @@ class TestTickerHistory(unittest.TestCase):
def setUp(self):
# use a ticker that has dividends
self.ticker = yf.Ticker("IBM", session=self.session)
self.symbol = "IBM"
self.ticker = yf.Ticker(self.symbol, session=self.session)
self.symbols = ["AMZN", "MSFT", "NVDA"]
def tearDown(self):
self.ticker = None
def test_history(self):
with self.assertRaises(RuntimeError):
self.ticker.history_metadata
md = self.ticker.history_metadata
self.assertIn("IBM", md.values(), "metadata missing")
data = self.ticker.history("1y")
self.assertIn("IBM", self.ticker.history_metadata.values(), "metadata missing")
self.assertIsInstance(data, pd.DataFrame, "data has wrong type")
self.assertFalse(data.empty, "data is empty")
def test_download(self):
for t in [False, True]:
for i in [False, True]:
data = yf.download(self.symbols, threads=t, ignore_tz=i)
self.assertIsInstance(data, pd.DataFrame, "data has wrong type")
self.assertFalse(data.empty, "data is empty")
def test_no_expensive_calls_introduced(self):
"""
Make sure calling history to get price data has not introduced more calls to yahoo than absolutely necessary.
@ -294,7 +318,7 @@ class TestTickerHistory(unittest.TestCase):
actual_urls_called = tuple([r.url for r in session.cache.filter()])
session.close()
expected_urls = (
'https://query2.finance.yahoo.com/v8/finance/chart/GOOGL?range=1y&interval=1d&includePrePost=False&events=div%2Csplits%2CcapitalGains',
'https://query2.finance.yahoo.com/v8/finance/chart/GOOGL?events=div,splits,capitalGains&includePrePost=False&interval=1d&range=1y',
)
self.assertEqual(expected_urls, actual_urls_called, "Different than expected url used to fetch history.")
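Note: this assertion works because every request runs through a requests_cache session, which records each URL it serves. A minimal sketch, assuming requests_cache 1.x (where cache.filter() iterates the cached responses), of auditing the calls a code path makes:

import requests_cache
import yfinance as yf

session = requests_cache.CachedSession(backend="memory")
yf.Ticker("GOOGL", session=session).history(period="1y", interval="1d")
urls = [r.url for r in session.cache.filter()]  # every URL yfinance hit
print(urls)
session.close()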
@ -314,75 +338,76 @@ class TestTickerHistory(unittest.TestCase):
self.assertFalse(data.empty, "data is empty")
class TestTickerEarnings(unittest.TestCase):
session = None
# Below will fail because not yet ported to the Yahoo API
# class TestTickerEarnings(unittest.TestCase):
# session = None
@classmethod
def setUpClass(cls):
cls.session = requests_cache.CachedSession(backend='memory')
# @classmethod
# def setUpClass(cls):
# cls.session = session_gbl
@classmethod
def tearDownClass(cls):
if cls.session is not None:
cls.session.close()
# @classmethod
# def tearDownClass(cls):
# if cls.session is not None:
# cls.session.close()
def setUp(self):
self.ticker = yf.Ticker("GOOGL", session=self.session)
# def setUp(self):
# self.ticker = yf.Ticker("GOOGL", session=self.session)
def tearDown(self):
self.ticker = None
# def tearDown(self):
# self.ticker = None
def test_earnings(self):
data = self.ticker.earnings
self.assertIsInstance(data, pd.DataFrame, "data has wrong type")
self.assertFalse(data.empty, "data is empty")
# def test_earnings(self):
# data = self.ticker.earnings
# self.assertIsInstance(data, pd.DataFrame, "data has wrong type")
# self.assertFalse(data.empty, "data is empty")
data_cached = self.ticker.earnings
self.assertIs(data, data_cached, "data not cached")
# data_cached = self.ticker.earnings
# self.assertIs(data, data_cached, "data not cached")
def test_quarterly_earnings(self):
data = self.ticker.quarterly_earnings
self.assertIsInstance(data, pd.DataFrame, "data has wrong type")
self.assertFalse(data.empty, "data is empty")
# def test_quarterly_earnings(self):
# data = self.ticker.quarterly_earnings
# self.assertIsInstance(data, pd.DataFrame, "data has wrong type")
# self.assertFalse(data.empty, "data is empty")
data_cached = self.ticker.quarterly_earnings
self.assertIs(data, data_cached, "data not cached")
# data_cached = self.ticker.quarterly_earnings
# self.assertIs(data, data_cached, "data not cached")
def test_earnings_forecasts(self):
data = self.ticker.earnings_forecasts
self.assertIsInstance(data, pd.DataFrame, "data has wrong type")
self.assertFalse(data.empty, "data is empty")
# def test_earnings_forecasts(self):
# data = self.ticker.earnings_forecasts
# self.assertIsInstance(data, pd.DataFrame, "data has wrong type")
# self.assertFalse(data.empty, "data is empty")
data_cached = self.ticker.earnings_forecasts
self.assertIs(data, data_cached, "data not cached")
# data_cached = self.ticker.earnings_forecasts
# self.assertIs(data, data_cached, "data not cached")
def test_earnings_dates(self):
data = self.ticker.earnings_dates
self.assertIsInstance(data, pd.DataFrame, "data has wrong type")
self.assertFalse(data.empty, "data is empty")
# def test_earnings_dates(self):
# data = self.ticker.earnings_dates
# self.assertIsInstance(data, pd.DataFrame, "data has wrong type")
# self.assertFalse(data.empty, "data is empty")
data_cached = self.ticker.earnings_dates
self.assertIs(data, data_cached, "data not cached")
# data_cached = self.ticker.earnings_dates
# self.assertIs(data, data_cached, "data not cached")
def test_earnings_trend(self):
data = self.ticker.earnings_trend
self.assertIsInstance(data, pd.DataFrame, "data has wrong type")
self.assertFalse(data.empty, "data is empty")
# def test_earnings_trend(self):
# data = self.ticker.earnings_trend
# self.assertIsInstance(data, pd.DataFrame, "data has wrong type")
# self.assertFalse(data.empty, "data is empty")
data_cached = self.ticker.earnings_trend
self.assertIs(data, data_cached, "data not cached")
# data_cached = self.ticker.earnings_trend
# self.assertIs(data, data_cached, "data not cached")
def test_earnings_dates_with_limit(self):
# use ticker with lots of historic earnings
ticker = yf.Ticker("IBM")
limit = 110
data = ticker.get_earnings_dates(limit=limit)
self.assertIsInstance(data, pd.DataFrame, "data has wrong type")
self.assertFalse(data.empty, "data is empty")
self.assertEqual(len(data), limit, "Wrong number of rows")
# def test_earnings_dates_with_limit(self):
# # use ticker with lots of historic earnings
# ticker = yf.Ticker("IBM")
# limit = 110
# data = ticker.get_earnings_dates(limit=limit)
# self.assertIsInstance(data, pd.DataFrame, "data has wrong type")
# self.assertFalse(data.empty, "data is empty")
# self.assertEqual(len(data), limit, "Wrong number of rows")
data_cached = ticker.get_earnings_dates(limit=limit)
self.assertIs(data, data_cached, "data not cached")
# data_cached = ticker.get_earnings_dates(limit=limit)
# self.assertIs(data, data_cached, "data not cached")
class TestTickerHolders(unittest.TestCase):
@ -390,7 +415,7 @@ class TestTickerHolders(unittest.TestCase):
@classmethod
def setUpClass(cls):
cls.session = requests_cache.CachedSession(backend='memory')
cls.session = session_gbl
@classmethod
def tearDownClass(cls):
@ -433,7 +458,7 @@ class TestTickerMiscFinancials(unittest.TestCase):
@classmethod
def setUpClass(cls):
cls.session = requests_cache.CachedSession(backend='memory')
cls.session = session_gbl
@classmethod
def tearDownClass(cls):
@ -451,6 +476,24 @@ class TestTickerMiscFinancials(unittest.TestCase):
def tearDown(self):
self.ticker = None
def test_isin(self):
data = self.ticker.isin
self.assertIsInstance(data, str, "data has wrong type")
self.assertEqual("ARDEUT116159", data, "data is empty")
data_cached = self.ticker.isin
self.assertIs(data, data_cached, "data not cached")
def test_options(self):
data = self.ticker.options
self.assertIsInstance(data, tuple, "data has wrong type")
self.assertTrue(len(data) > 1, "data is empty")
def test_shares_full(self):
data = self.ticker.get_shares_full()
self.assertIsInstance(data, pd.Series, "data has wrong type")
self.assertFalse(data.empty, "data is empty")
def test_income_statement(self):
expected_keys = ["Total Revenue", "Basic EPS"]
expected_periods_days = 365
@ -480,7 +523,6 @@ class TestTickerMiscFinancials(unittest.TestCase):
data = self.ticker.get_income_stmt(as_dict=True)
self.assertIsInstance(data, dict, "data has wrong type")
def test_quarterly_income_statement(self):
expected_keys = ["Total Revenue", "Basic EPS"]
expected_periods_days = 365//4
@ -510,16 +552,6 @@ class TestTickerMiscFinancials(unittest.TestCase):
data = self.ticker.get_income_stmt(as_dict=True)
self.assertIsInstance(data, dict, "data has wrong type")
def test_quarterly_income_statement_old_fmt(self):
expected_row = "TotalRevenue"
data = self.ticker_old_fmt.get_income_stmt(freq="quarterly", legacy=True)
self.assertIsInstance(data, pd.DataFrame, "data has wrong type")
self.assertFalse(data.empty, "data is empty")
self.assertIn(expected_row, data.index, "Did not find expected row in index")
data_cached = self.ticker_old_fmt.get_income_stmt(freq="quarterly", legacy=True)
self.assertIs(data, data_cached, "data not cached")
def test_balance_sheet(self):
expected_keys = ["Total Assets", "Net PPE"]
expected_periods_days = 365
@ -578,16 +610,6 @@ class TestTickerMiscFinancials(unittest.TestCase):
data = self.ticker.get_balance_sheet(as_dict=True, freq="quarterly")
self.assertIsInstance(data, dict, "data has wrong type")
def test_quarterly_balance_sheet_old_fmt(self):
expected_row = "TotalAssets"
data = self.ticker_old_fmt.get_balance_sheet(freq="quarterly", legacy=True)
self.assertIsInstance(data, pd.DataFrame, "data has wrong type")
self.assertFalse(data.empty, "data is empty")
self.assertIn(expected_row, data.index, "Did not find expected row in index")
data_cached = self.ticker_old_fmt.get_balance_sheet(freq="quarterly", legacy=True)
self.assertIs(data, data_cached, "data not cached")
def test_cash_flow(self):
expected_keys = ["Operating Cash Flow", "Net PPE Purchase And Sale"]
expected_periods_days = 365
@ -646,16 +668,6 @@ class TestTickerMiscFinancials(unittest.TestCase):
data = self.ticker.get_cashflow(as_dict=True)
self.assertIsInstance(data, dict, "data has wrong type")
def test_quarterly_cashflow_old_fmt(self):
expected_row = "NetIncome"
data = self.ticker_old_fmt.get_cashflow(legacy=True, freq="quarterly")
self.assertIsInstance(data, pd.DataFrame, "data has wrong type")
self.assertFalse(data.empty, "data is empty")
self.assertIn(expected_row, data.index, "Did not find expected row in index")
data_cached = self.ticker_old_fmt.get_cashflow(legacy=True, freq="quarterly")
self.assertIs(data, data_cached, "data not cached")
def test_income_alt_names(self):
i1 = self.ticker.income_stmt
i2 = self.ticker.incomestmt
@ -715,87 +727,71 @@ class TestTickerMiscFinancials(unittest.TestCase):
i2 = self.ticker.get_cashflow(freq="quarterly")
self.assertTrue(i1.equals(i2))
def test_sustainability(self):
data = self.ticker.sustainability
self.assertIsInstance(data, pd.DataFrame, "data has wrong type")
self.assertFalse(data.empty, "data is empty")
data_cached = self.ticker.sustainability
self.assertIs(data, data_cached, "data not cached")
def test_recommendations(self):
data = self.ticker.recommendations
self.assertIsInstance(data, pd.DataFrame, "data has wrong type")
self.assertFalse(data.empty, "data is empty")
data_cached = self.ticker.recommendations
self.assertIs(data, data_cached, "data not cached")
def test_recommendations_summary(self):
data = self.ticker.recommendations_summary
self.assertIsInstance(data, pd.DataFrame, "data has wrong type")
self.assertFalse(data.empty, "data is empty")
data_cached = self.ticker.recommendations_summary
self.assertIs(data, data_cached, "data not cached")
def test_analyst_price_target(self):
data = self.ticker.analyst_price_target
self.assertIsInstance(data, pd.DataFrame, "data has wrong type")
self.assertFalse(data.empty, "data is empty")
data_cached = self.ticker.analyst_price_target
self.assertIs(data, data_cached, "data not cached")
def test_revenue_forecasts(self):
data = self.ticker.revenue_forecasts
self.assertIsInstance(data, pd.DataFrame, "data has wrong type")
self.assertFalse(data.empty, "data is empty")
data_cached = self.ticker.revenue_forecasts
self.assertIs(data, data_cached, "data not cached")
def test_calendar(self):
data = self.ticker.calendar
self.assertIsInstance(data, pd.DataFrame, "data has wrong type")
self.assertFalse(data.empty, "data is empty")
data_cached = self.ticker.calendar
self.assertIs(data, data_cached, "data not cached")
def test_isin(self):
data = self.ticker.isin
self.assertIsInstance(data, str, "data has wrong type")
self.assertEqual("ARDEUT116159", data, "data is empty")
data_cached = self.ticker.isin
self.assertIs(data, data_cached, "data not cached")
def test_options(self):
data = self.ticker.options
self.assertIsInstance(data, tuple, "data has wrong type")
self.assertTrue(len(data) > 1, "data is empty")
def test_shares(self):
data = self.ticker.shares
self.assertIsInstance(data, pd.DataFrame, "data has wrong type")
self.assertFalse(data.empty, "data is empty")
def test_shares_full(self):
data = self.ticker.get_shares_full()
self.assertIsInstance(data, pd.Series, "data has wrong type")
self.assertFalse(data.empty, "data is empty")
def test_bad_freq_value_raises_exception(self):
self.assertRaises(ValueError, lambda: self.ticker.get_cashflow(freq="badarg"))
# Below will fail because not yet ported to the Yahoo API
# def test_sustainability(self):
# data = self.ticker.sustainability
# self.assertIsInstance(data, pd.DataFrame, "data has wrong type")
# self.assertFalse(data.empty, "data is empty")
# data_cached = self.ticker.sustainability
# self.assertIs(data, data_cached, "data not cached")
# def test_recommendations(self):
# data = self.ticker.recommendations
# self.assertIsInstance(data, pd.DataFrame, "data has wrong type")
# self.assertFalse(data.empty, "data is empty")
# data_cached = self.ticker.recommendations
# self.assertIs(data, data_cached, "data not cached")
# def test_recommendations_summary(self):
# data = self.ticker.recommendations_summary
# self.assertIsInstance(data, pd.DataFrame, "data has wrong type")
# self.assertFalse(data.empty, "data is empty")
# data_cached = self.ticker.recommendations_summary
# self.assertIs(data, data_cached, "data not cached")
# def test_analyst_price_target(self):
# data = self.ticker.analyst_price_target
# self.assertIsInstance(data, pd.DataFrame, "data has wrong type")
# self.assertFalse(data.empty, "data is empty")
# data_cached = self.ticker.analyst_price_target
# self.assertIs(data, data_cached, "data not cached")
# def test_revenue_forecasts(self):
# data = self.ticker.revenue_forecasts
# self.assertIsInstance(data, pd.DataFrame, "data has wrong type")
# self.assertFalse(data.empty, "data is empty")
# data_cached = self.ticker.revenue_forecasts
# self.assertIs(data, data_cached, "data not cached")
# def test_calendar(self):
# data = self.ticker.calendar
# self.assertIsInstance(data, pd.DataFrame, "data has wrong type")
# self.assertFalse(data.empty, "data is empty")
# data_cached = self.ticker.calendar
# self.assertIs(data, data_cached, "data not cached")
# def test_shares(self):
# data = self.ticker.shares
# self.assertIsInstance(data, pd.DataFrame, "data has wrong type")
# self.assertFalse(data.empty, "data is empty")
class TestTickerInfo(unittest.TestCase):
session = None
@classmethod
def setUpClass(cls):
cls.session = requests_cache.CachedSession(backend='memory')
cls.session = session_gbl
@classmethod
def tearDownClass(cls):
@ -813,110 +809,116 @@ class TestTickerInfo(unittest.TestCase):
def tearDown(self):
self.ticker = None
def test_fast_info(self):
f = yf.Ticker("AAPL", session=self.session).fast_info
for k in f:
self.assertIsNotNone(f[k])
def test_info(self):
data = self.tickers[0].info
self.assertIsInstance(data, dict, "data has wrong type")
self.assertIn("symbol", data.keys(), "Did not find expected key in info dict")
expected_keys = ['industry', 'currentPrice', 'exchange', 'floatShares', 'companyOfficers', 'bid']
for k in expected_keys:
self.assertIn(k, data.keys(), f"Did not find expected key '{k}' in info dict")
self.assertEqual(self.symbols[0], data["symbol"], "Wrong symbol value in info dict")
def test_fast_info(self):
yf.scrapers.quote.PRUNE_INFO = False
# def test_fast_info_matches_info(self):
# fast_info_keys = set()
# for ticker in self.tickers:
# fast_info_keys.update(set(ticker.fast_info.keys()))
# fast_info_keys = sorted(list(fast_info_keys))
fast_info_keys = set()
for ticker in self.tickers:
fast_info_keys.update(set(ticker.fast_info.keys()))
fast_info_keys = sorted(list(fast_info_keys))
# key_rename_map = {}
# key_rename_map["currency"] = "currency"
# key_rename_map["quote_type"] = "quoteType"
# key_rename_map["timezone"] = "exchangeTimezoneName"
key_rename_map = {}
key_rename_map["currency"] = "currency"
key_rename_map["quote_type"] = "quoteType"
key_rename_map["timezone"] = "exchangeTimezoneName"
# key_rename_map["last_price"] = ["currentPrice", "regularMarketPrice"]
# key_rename_map["open"] = ["open", "regularMarketOpen"]
# key_rename_map["day_high"] = ["dayHigh", "regularMarketDayHigh"]
# key_rename_map["day_low"] = ["dayLow", "regularMarketDayLow"]
# key_rename_map["previous_close"] = ["previousClose"]
# key_rename_map["regular_market_previous_close"] = ["regularMarketPreviousClose"]
key_rename_map["last_price"] = ["currentPrice", "regularMarketPrice"]
key_rename_map["open"] = ["open", "regularMarketOpen"]
key_rename_map["day_high"] = ["dayHigh", "regularMarketDayHigh"]
key_rename_map["day_low"] = ["dayLow", "regularMarketDayLow"]
key_rename_map["previous_close"] = ["previousClose"]
key_rename_map["regular_market_previous_close"] = ["regularMarketPreviousClose"]
# key_rename_map["fifty_day_average"] = "fiftyDayAverage"
# key_rename_map["two_hundred_day_average"] = "twoHundredDayAverage"
# key_rename_map["year_change"] = ["52WeekChange", "fiftyTwoWeekChange"]
# key_rename_map["year_high"] = "fiftyTwoWeekHigh"
# key_rename_map["year_low"] = "fiftyTwoWeekLow"
key_rename_map["fifty_day_average"] = "fiftyDayAverage"
key_rename_map["two_hundred_day_average"] = "twoHundredDayAverage"
key_rename_map["year_change"] = ["52WeekChange", "fiftyTwoWeekChange"]
key_rename_map["year_high"] = "fiftyTwoWeekHigh"
key_rename_map["year_low"] = "fiftyTwoWeekLow"
# key_rename_map["last_volume"] = ["volume", "regularMarketVolume"]
# key_rename_map["ten_day_average_volume"] = ["averageVolume10days", "averageDailyVolume10Day"]
# key_rename_map["three_month_average_volume"] = "averageVolume"
key_rename_map["last_volume"] = ["volume", "regularMarketVolume"]
key_rename_map["ten_day_average_volume"] = ["averageVolume10days", "averageDailyVolume10Day"]
key_rename_map["three_month_average_volume"] = "averageVolume"
# key_rename_map["market_cap"] = "marketCap"
# key_rename_map["shares"] = "sharesOutstanding"
key_rename_map["market_cap"] = "marketCap"
key_rename_map["shares"] = "sharesOutstanding"
# for k in list(key_rename_map.keys()):
# if '_' in k:
# key_rename_map[yf.utils.snake_case_2_camelCase(k)] = key_rename_map[k]
for k in list(key_rename_map.keys()):
if '_' in k:
key_rename_map[yf.utils.snake_case_2_camelCase(k)] = key_rename_map[k]
# # Note: share count items in info[] are bad. Sometimes the float > outstanding!
# # So often fast_info["shares"] does not match.
# # Why isn't fast_info["shares"] wrong? Because using it to calculate market cap always correct.
# bad_keys = {"shares"}
# Note: share count items in info[] are bad. Sometimes the float > outstanding!
# So often fast_info["shares"] does not match.
# Why isn't fast_info["shares"] wrong? Because using it to calculate market cap is always correct.
bad_keys = {"shares"}
# # Loose tolerance for averages, no idea why don't match info[]. Is info wrong?
# custom_tolerances = {}
# custom_tolerances["year_change"] = 1.0
# # custom_tolerances["ten_day_average_volume"] = 1e-3
# custom_tolerances["ten_day_average_volume"] = 1e-1
# # custom_tolerances["three_month_average_volume"] = 1e-2
# custom_tolerances["three_month_average_volume"] = 5e-1
# custom_tolerances["fifty_day_average"] = 1e-2
# custom_tolerances["two_hundred_day_average"] = 1e-2
# for k in list(custom_tolerances.keys()):
# if '_' in k:
# custom_tolerances[yf.utils.snake_case_2_camelCase(k)] = custom_tolerances[k]
# Loose tolerance for averages; no idea why they don't match info[]. Is info wrong?
custom_tolerances = {}
custom_tolerances["year_change"] = 1.0
# custom_tolerances["ten_day_average_volume"] = 1e-3
custom_tolerances["ten_day_average_volume"] = 1e-1
# custom_tolerances["three_month_average_volume"] = 1e-2
custom_tolerances["three_month_average_volume"] = 5e-1
custom_tolerances["fifty_day_average"] = 1e-2
custom_tolerances["two_hundred_day_average"] = 1e-2
for k in list(custom_tolerances.keys()):
if '_' in k:
custom_tolerances[yf.utils.snake_case_2_camelCase(k)] = custom_tolerances[k]
for k in fast_info_keys:
    if k in key_rename_map:
        k2 = key_rename_map[k]
    else:
        k2 = k
    if not isinstance(k2, list):
        k2 = [k2]
    for m in k2:
        for ticker in self.tickers:
            if m not in ticker.info:
                # print(f"symbol={ticker.ticker}: fast_info key '{k}' mapped to info key '{m}' but not present in info")
                continue
            if k in bad_keys:
                continue
            if k in custom_tolerances:
                rtol = custom_tolerances[k]
            else:
                rtol = 5e-3
                # rtol = 1e-4
            correct = ticker.info[m]
            test = ticker.fast_info[k]
            # print(f"Testing: symbol={ticker.ticker} m={m} k={k}: test={test} vs correct={correct}")
            if k in ["market_cap", "marketCap"] and ticker.fast_info["currency"] in ["GBp", "ILA"]:
                # Adjust for currency to match Yahoo:
                test *= 0.01
            try:
                if correct is None:
                    self.assertTrue(test is None or (not np.isnan(test)), f"{k}: {test} must be None or real value because correct={correct}")
                elif isinstance(test, float) or isinstance(correct, int):
                    self.assertTrue(np.isclose(test, correct, rtol=rtol), f"{ticker.ticker} {k}: {test} != {correct}")
                else:
                    self.assertEqual(test, correct, f"{k}: {test} != {correct}")
            except:
                if k in ["regularMarketPreviousClose"] and ticker.ticker in ["ADS.DE"]:
                    # Yahoo is wrong here - returning the post-market close, not the regular close
                    continue
                else:
                    raise
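For orientation, a minimal standalone sketch (fabricated numbers) of the rename-plus-tolerance pattern this test applies per key: a snake_case fast_info key is mapped to its camelCase info[] counterpart, then the two values are compared with a per-key relative tolerance:

import numpy as np

key_rename_map = {"three_month_average_volume": "averageVolume"}
custom_tolerances = {"three_month_average_volume": 5e-1}

k = "three_month_average_volume"
m = key_rename_map[k]                  # info[] key: "averageVolume"
rtol = custom_tolerances.get(k, 5e-3)  # fall back to the default 0.5% tolerance

test, correct = 1_020_000, 1_000_000   # fabricated fast_info vs info[] values
assert np.isclose(test, correct, rtol=rtol)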

View File

@ -23,7 +23,7 @@ from . import version
from .ticker import Ticker
from .tickers import Tickers
from .multi import download
from .utils import set_tz_cache_location
from .utils import set_tz_cache_location, enable_debug_mode
__version__ = version.version
__author__ = "Ran Aroussi"
@ -43,4 +43,4 @@ def pdr_override():
pass
__all__ = ['download', 'Ticker', 'Tickers', 'pdr_override', 'set_tz_cache_location']
__all__ = ['download', 'Ticker', 'Tickers', 'pdr_override', 'enable_debug_mode', 'set_tz_cache_location']
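Usage sketch for the newly exported helper (assuming, per the README's logging section, that it takes no arguments):

import yfinance as yf

yf.enable_debug_mode()        # verbose DEBUG logging from the 'yfinance' logger
yf.Ticker("MSFT").history(period="5d")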

File diff suppressed because it is too large

View File

@ -0,0 +1,8 @@
fundamentals_keys = {}
fundamentals_keys['financials'] = ["TaxEffectOfUnusualItems","TaxRateForCalcs","NormalizedEBITDA","NormalizedDilutedEPS","NormalizedBasicEPS","TotalUnusualItems","TotalUnusualItemsExcludingGoodwill","NetIncomeFromContinuingOperationNetMinorityInterest","ReconciledDepreciation","ReconciledCostOfRevenue","EBITDA","EBIT","NetInterestIncome","InterestExpense","InterestIncome","ContinuingAndDiscontinuedDilutedEPS","ContinuingAndDiscontinuedBasicEPS","NormalizedIncome","NetIncomeFromContinuingAndDiscontinuedOperation","TotalExpenses","RentExpenseSupplemental","ReportedNormalizedDilutedEPS","ReportedNormalizedBasicEPS","TotalOperatingIncomeAsReported","DividendPerShare","DilutedAverageShares","BasicAverageShares","DilutedEPS","DilutedEPSOtherGainsLosses","TaxLossCarryforwardDilutedEPS","DilutedAccountingChange","DilutedExtraordinary","DilutedDiscontinuousOperations","DilutedContinuousOperations","BasicEPS","BasicEPSOtherGainsLosses","TaxLossCarryforwardBasicEPS","BasicAccountingChange","BasicExtraordinary","BasicDiscontinuousOperations","BasicContinuousOperations","DilutedNIAvailtoComStockholders","AverageDilutionEarnings","NetIncomeCommonStockholders","OtherunderPreferredStockDividend","PreferredStockDividends","NetIncome","MinorityInterests","NetIncomeIncludingNoncontrollingInterests","NetIncomeFromTaxLossCarryforward","NetIncomeExtraordinary","NetIncomeDiscontinuousOperations","NetIncomeContinuousOperations","EarningsFromEquityInterestNetOfTax","TaxProvision","PretaxIncome","OtherIncomeExpense","OtherNonOperatingIncomeExpenses","SpecialIncomeCharges","GainOnSaleOfPPE","GainOnSaleOfBusiness","OtherSpecialCharges","WriteOff","ImpairmentOfCapitalAssets","RestructuringAndMergernAcquisition","SecuritiesAmortization","EarningsFromEquityInterest","GainOnSaleOfSecurity","NetNonOperatingInterestIncomeExpense","TotalOtherFinanceCost","InterestExpenseNonOperating","InterestIncomeNonOperating","OperatingIncome","OperatingExpense","OtherOperatingExpenses","OtherTaxes","ProvisionForDoubtfulAccounts","DepreciationAmortizationDepletionIncomeStatement","DepletionIncomeStatement","DepreciationAndAmortizationInIncomeStatement","Amortization","AmortizationOfIntangiblesIncomeStatement","DepreciationIncomeStatement","ResearchAndDevelopment","SellingGeneralAndAdministration","SellingAndMarketingExpense","GeneralAndAdministrativeExpense","OtherGandA","InsuranceAndClaims","RentAndLandingFees","SalariesAndWages","GrossProfit","CostOfRevenue","TotalRevenue","ExciseTaxes","OperatingRevenue"]
fundamentals_keys['balance-sheet'] = ["TreasurySharesNumber","PreferredSharesNumber","OrdinarySharesNumber","ShareIssued","NetDebt","TotalDebt","TangibleBookValue","InvestedCapital","WorkingCapital","NetTangibleAssets","CapitalLeaseObligations","CommonStockEquity","PreferredStockEquity","TotalCapitalization","TotalEquityGrossMinorityInterest","MinorityInterest","StockholdersEquity","OtherEquityInterest","GainsLossesNotAffectingRetainedEarnings","OtherEquityAdjustments","FixedAssetsRevaluationReserve","ForeignCurrencyTranslationAdjustments","MinimumPensionLiabilities","UnrealizedGainLoss","TreasuryStock","RetainedEarnings","AdditionalPaidInCapital","CapitalStock","OtherCapitalStock","CommonStock","PreferredStock","TotalPartnershipCapital","GeneralPartnershipCapital","LimitedPartnershipCapital","TotalLiabilitiesNetMinorityInterest","TotalNonCurrentLiabilitiesNetMinorityInterest","OtherNonCurrentLiabilities","LiabilitiesHeldforSaleNonCurrent","RestrictedCommonStock","PreferredSecuritiesOutsideStockEquity","DerivativeProductLiabilities","EmployeeBenefits","NonCurrentPensionAndOtherPostretirementBenefitPlans","NonCurrentAccruedExpenses","DuetoRelatedPartiesNonCurrent","TradeandOtherPayablesNonCurrent","NonCurrentDeferredLiabilities","NonCurrentDeferredRevenue","NonCurrentDeferredTaxesLiabilities","LongTermDebtAndCapitalLeaseObligation","LongTermCapitalLeaseObligation","LongTermDebt","LongTermProvisions","CurrentLiabilities","OtherCurrentLiabilities","CurrentDeferredLiabilities","CurrentDeferredRevenue","CurrentDeferredTaxesLiabilities","CurrentDebtAndCapitalLeaseObligation","CurrentCapitalLeaseObligation","CurrentDebt","OtherCurrentBorrowings","LineOfCredit","CommercialPaper","CurrentNotesPayable","PensionandOtherPostRetirementBenefitPlansCurrent","CurrentProvisions","PayablesAndAccruedExpenses","CurrentAccruedExpenses","InterestPayable","Payables","OtherPayable","DuetoRelatedPartiesCurrent","DividendsPayable","TotalTaxPayable","IncomeTaxPayable","AccountsPayable","TotalAssets","TotalNonCurrentAssets","OtherNonCurrentAssets","DefinedPensionBenefit","NonCurrentPrepaidAssets","NonCurrentDeferredAssets","NonCurrentDeferredTaxesAssets","DuefromRelatedPartiesNonCurrent","NonCurrentNoteReceivables","NonCurrentAccountsReceivable","FinancialAssets","InvestmentsAndAdvances","OtherInvestments","InvestmentinFinancialAssets","HeldToMaturitySecurities","AvailableForSaleSecurities","FinancialAssetsDesignatedasFairValueThroughProfitorLossTotal","TradingSecurities","LongTermEquityInvestment","InvestmentsinJointVenturesatCost","InvestmentsInOtherVenturesUnderEquityMethod","InvestmentsinAssociatesatCost","InvestmentsinSubsidiariesatCost","InvestmentProperties","GoodwillAndOtherIntangibleAssets","OtherIntangibleAssets","Goodwill","NetPPE","AccumulatedDepreciation","GrossPPE","Leases","ConstructionInProgress","OtherProperties","MachineryFurnitureEquipment","BuildingsAndImprovements","LandAndImprovements","Properties","CurrentAssets","OtherCurrentAssets","HedgingAssetsCurrent","AssetsHeldForSaleCurrent","CurrentDeferredAssets","CurrentDeferredTaxesAssets","RestrictedCash","PrepaidAssets","Inventory","InventoriesAdjustmentsAllowances","OtherInventories","FinishedGoods","WorkInProcess","RawMaterials","Receivables","ReceivablesAdjustmentsAllowances","OtherReceivables","DuefromRelatedPartiesCurrent","TaxesReceivable","AccruedInterestReceivable","NotesReceivable","LoansReceivable","AccountsReceivable","AllowanceForDoubtfulAccountsReceivable","GrossAccountsReceivable","CashCashEquivalentsAndShortTermInvestments","OtherShortTermInvestments","CashAndCashEquivalents","CashEquivalents","CashFinancial"]
fundamentals_keys['cash-flow'] = ["ForeignSales","DomesticSales","AdjustedGeographySegmentData","FreeCashFlow","RepurchaseOfCapitalStock","RepaymentOfDebt","IssuanceOfDebt","IssuanceOfCapitalStock","CapitalExpenditure","InterestPaidSupplementalData","IncomeTaxPaidSupplementalData","EndCashPosition","OtherCashAdjustmentOutsideChangeinCash","BeginningCashPosition","EffectOfExchangeRateChanges","ChangesInCash","OtherCashAdjustmentInsideChangeinCash","CashFlowFromDiscontinuedOperation","FinancingCashFlow","CashFromDiscontinuedFinancingActivities","CashFlowFromContinuingFinancingActivities","NetOtherFinancingCharges","InterestPaidCFF","ProceedsFromStockOptionExercised","CashDividendsPaid","PreferredStockDividendPaid","CommonStockDividendPaid","NetPreferredStockIssuance","PreferredStockPayments","PreferredStockIssuance","NetCommonStockIssuance","CommonStockPayments","CommonStockIssuance","NetIssuancePaymentsOfDebt","NetShortTermDebtIssuance","ShortTermDebtPayments","ShortTermDebtIssuance","NetLongTermDebtIssuance","LongTermDebtPayments","LongTermDebtIssuance","InvestingCashFlow","CashFromDiscontinuedInvestingActivities","CashFlowFromContinuingInvestingActivities","NetOtherInvestingChanges","InterestReceivedCFI","DividendsReceivedCFI","NetInvestmentPurchaseAndSale","SaleOfInvestment","PurchaseOfInvestment","NetInvestmentPropertiesPurchaseAndSale","SaleOfInvestmentProperties","PurchaseOfInvestmentProperties","NetBusinessPurchaseAndSale","SaleOfBusiness","PurchaseOfBusiness","NetIntangiblesPurchaseAndSale","SaleOfIntangibles","PurchaseOfIntangibles","NetPPEPurchaseAndSale","SaleOfPPE","PurchaseOfPPE","CapitalExpenditureReported","OperatingCashFlow","CashFromDiscontinuedOperatingActivities","CashFlowFromContinuingOperatingActivities","TaxesRefundPaid","InterestReceivedCFO","InterestPaidCFO","DividendReceivedCFO","DividendPaidCFO","ChangeInWorkingCapital","ChangeInOtherWorkingCapital","ChangeInOtherCurrentLiabilities","ChangeInOtherCurrentAssets","ChangeInPayablesAndAccruedExpense","ChangeInAccruedExpense","ChangeInInterestPayable","ChangeInPayable","ChangeInDividendPayable","ChangeInAccountPayable","ChangeInTaxPayable","ChangeInIncomeTaxPayable","ChangeInPrepaidAssets","ChangeInInventory","ChangeInReceivables","ChangesInAccountReceivables","OtherNonCashItems","ExcessTaxBenefitFromStockBasedCompensation","StockBasedCompensation","UnrealizedGainLossOnInvestmentSecurities","ProvisionandWriteOffofAssets","AssetImpairmentCharge","AmortizationOfSecurities","DeferredTax","DeferredIncomeTax","DepreciationAmortizationDepletion","Depletion","DepreciationAndAmortization","AmortizationCashFlow","AmortizationOfIntangibles","Depreciation","OperatingGainsLosses","PensionAndEmployeeBenefitExpense","EarningsLossesFromEquityInvestments","GainLossOnInvestmentSecurities","NetForeignCurrencyExchangeGainLoss","GainLossOnSaleOfPPE","GainLossOnSaleOfBusiness","NetIncomeFromContinuingOperations","CashFlowsfromusedinOperatingActivitiesDirect","TaxesRefundPaidDirect","InterestReceivedDirect","InterestPaidDirect","DividendsReceivedDirect","DividendsPaidDirect","ClassesofCashPayments","OtherCashPaymentsfromOperatingActivities","PaymentsonBehalfofEmployees","PaymentstoSuppliersforGoodsandServices","ClassesofCashReceiptsfromOperatingActivities","OtherCashReceiptsfromOperatingActivities","ReceiptsfromGovernmentGrants","ReceiptsfromCustomers"]
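A sketch of how these key lists are consumed (mirroring the Financials change later in this diff, where 'income' maps to Yahoo's internal 'financials' key):

from yfinance import const

name = "income"
if name == "income":
    name = "financials"              # Yahoo stores the income statement under 'financials'
keys = const.fundamentals_keys[name]
print(len(keys), keys[:3])           # the row keys requested for the financials table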

View File

@ -1,27 +1,16 @@
import functools
from functools import lru_cache
import hashlib
from base64 import b64decode
usePycryptodome = False # slightly faster
# usePycryptodome = True
if usePycryptodome:
from Crypto.Cipher import AES
from Crypto.Util.Padding import unpad
else:
from cryptography.hazmat.primitives import padding
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes
import logging
import requests as requests
import re
from bs4 import BeautifulSoup
import random
import time
from frozendict import frozendict
try:
import ujson as json
except ImportError:
import json as json
from . import utils
cache_maxsize = 64
@ -47,127 +36,6 @@ def lru_cache_freezeargs(func):
return wrapped
def _extract_extra_keys_from_stores(data):
new_keys = [k for k in data.keys() if k not in ["context", "plugins"]]
new_keys_values = set([data[k] for k in new_keys])
# Maybe multiple keys have same value - keep one of each
new_keys_uniq = []
new_keys_uniq_values = set()
for k in new_keys:
v = data[k]
if not v in new_keys_uniq_values:
new_keys_uniq.append(k)
new_keys_uniq_values.add(v)
return [data[k] for k in new_keys_uniq]
def decrypt_cryptojs_aes_stores(data, keys=None):
encrypted_stores = data['context']['dispatcher']['stores']
password = None
if keys is not None:
if not isinstance(keys, list):
raise TypeError("'keys' must be list")
candidate_passwords = keys
else:
candidate_passwords = []
if "_cs" in data and "_cr" in data:
_cs = data["_cs"]
_cr = data["_cr"]
_cr = b"".join(int.to_bytes(i, length=4, byteorder="big", signed=True) for i in json.loads(_cr)["words"])
password = hashlib.pbkdf2_hmac("sha1", _cs.encode("utf8"), _cr, 1, dklen=32).hex()
encrypted_stores = b64decode(encrypted_stores)
assert encrypted_stores[0:8] == b"Salted__"
salt = encrypted_stores[8:16]
encrypted_stores = encrypted_stores[16:]
def _EVPKDF(password, salt, keySize=32, ivSize=16, iterations=1, hashAlgorithm="md5") -> tuple:
"""OpenSSL EVP Key Derivation Function
Args:
password (Union[str, bytes, bytearray]): Password to generate key from.
salt (Union[bytes, bytearray]): Salt to use.
keySize (int, optional): Output key length in bytes. Defaults to 32.
ivSize (int, optional): Output Initialization Vector (IV) length in bytes. Defaults to 16.
iterations (int, optional): Number of iterations to perform. Defaults to 1.
hashAlgorithm (str, optional): Hash algorithm to use for the KDF. Defaults to 'md5'.
Returns:
key, iv: Derived key and Initialization Vector (IV) bytes.
Taken from: https://gist.github.com/rafiibrahim8/0cd0f8c46896cafef6486cb1a50a16d3
OpenSSL original code: https://github.com/openssl/openssl/blob/master/crypto/evp/evp_key.c#L78
"""
assert iterations > 0, "Iterations can not be less than 1."
if isinstance(password, str):
password = password.encode("utf-8")
final_length = keySize + ivSize
key_iv = b""
block = None
while len(key_iv) < final_length:
hasher = hashlib.new(hashAlgorithm)
if block:
hasher.update(block)
hasher.update(password)
hasher.update(salt)
block = hasher.digest()
for _ in range(1, iterations):
block = hashlib.new(hashAlgorithm, block).digest()
key_iv += block
key, iv = key_iv[:keySize], key_iv[keySize:final_length]
return key, iv
def _decrypt(encrypted_stores, password, key, iv):
if usePycryptodome:
cipher = AES.new(key, AES.MODE_CBC, iv=iv)
plaintext = cipher.decrypt(encrypted_stores)
plaintext = unpad(plaintext, 16, style="pkcs7")
else:
cipher = Cipher(algorithms.AES(key), modes.CBC(iv))
decryptor = cipher.decryptor()
plaintext = decryptor.update(encrypted_stores) + decryptor.finalize()
unpadder = padding.PKCS7(128).unpadder()
plaintext = unpadder.update(plaintext) + unpadder.finalize()
plaintext = plaintext.decode("utf-8")
return plaintext
if not password is None:
try:
key, iv = _EVPKDF(password, salt, keySize=32, ivSize=16, iterations=1, hashAlgorithm="md5")
except:
raise Exception("yfinance failed to decrypt Yahoo data response")
plaintext = _decrypt(encrypted_stores, password, key, iv)
else:
success = False
for i in range(len(candidate_passwords)):
# print(f"Trying candiate pw {i+1}/{len(candidate_passwords)}")
password = candidate_passwords[i]
try:
key, iv = _EVPKDF(password, salt, keySize=32, ivSize=16, iterations=1, hashAlgorithm="md5")
plaintext = _decrypt(encrypted_stores, password, key, iv)
success = True
break
except:
pass
if not success:
raise Exception("yfinance failed to decrypt Yahoo data response")
decoded_stores = json.loads(plaintext)
return decoded_stores
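For illustration, a self-contained round-trip (toy password and payload, not Yahoo's) showing that the blob parsed above is plain OpenSSL-style AES-256-CBC: an 8-byte salt after the "Salted__" magic, with key and IV derived by EVP_BytesToKey (MD5, one iteration) exactly as _EVPKDF does:

import os
import hashlib
from cryptography.hazmat.primitives import padding
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

def evp_kdf(password: bytes, salt: bytes, key_size=32, iv_size=16):
    # OpenSSL EVP_BytesToKey with MD5 and a single iteration
    key_iv, block = b"", b""
    while len(key_iv) < key_size + iv_size:
        block = hashlib.md5(block + password + salt).digest()
        key_iv += block
    return key_iv[:key_size], key_iv[key_size:key_size + iv_size]

salt = os.urandom(8)
key, iv = evp_kdf(b"not-a-real-yahoo-key", salt)

padder = padding.PKCS7(128).padder()
enc = Cipher(algorithms.AES(key), modes.CBC(iv)).encryptor()
ciphertext = enc.update(padder.update(b'{"stores": {}}') + padder.finalize()) + enc.finalize()
blob = b"Salted__" + salt + ciphertext           # same layout the code above parses

key2, iv2 = evp_kdf(b"not-a-real-yahoo-key", blob[8:16])
dec = Cipher(algorithms.AES(key2), modes.CBC(iv2)).decryptor()
plain = dec.update(blob[16:]) + dec.finalize()
unpadder = padding.PKCS7(128).unpadder()
assert unpadder.update(plain) + unpadder.finalize() == b'{"stores": {}}'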
_SCRAPE_URL_ = 'https://finance.yahoo.com/quote'
class TickerData:
"""
Have one place to retrieve data from Yahoo API in order to ease caching and speed up operations
@ -202,123 +70,7 @@ class TickerData:
proxy = {"https": proxy}
return proxy
def _get_decryption_keys_from_yahoo_js(self, soup):
result = None
key_count = 4
re_script = soup.find("script", string=re.compile("root.App.main")).text
re_data = json.loads(re.search("root.App.main\s+=\s+(\{.*\})", re_script).group(1))
re_data.pop("context", None)
key_list = list(re_data.keys())
if re_data.get("plugins"): # 1) attempt to get last 4 keys after plugins
ind = key_list.index("plugins")
if len(key_list) > ind+1:
sub_keys = key_list[ind+1:]
if len(sub_keys) == key_count:
re_obj = {}
missing_val = False
for k in sub_keys:
if not re_data.get(k):
missing_val = True
break
re_obj.update({k: re_data.get(k)})
if not missing_val:
result = re_obj
if not result is None:
return [''.join(result.values())]
re_keys = [] # 2) attempt scan main.js file approach to get keys
prefix = "https://s.yimg.com/uc/finance/dd-site/js/main."
tags = [tag['src'] for tag in soup.find_all('script') if prefix in tag.get('src', '')]
for t in tags:
response_js = self.cache_get(t)
#
if response_js.status_code != 200:
time.sleep(random.randrange(10, 20))
response_js.close()
else:
r_data = response_js.content.decode("utf8")
re_list = [
x.group() for x in re.finditer(r"context.dispatcher.stores=JSON.parse((?:.*?\r?\n?)*)toString", r_data)
]
for rl in re_list:
re_sublist = [x.group() for x in re.finditer(r"t\[\"((?:.*?\r?\n?)*)\"\]", rl)]
if len(re_sublist) == key_count:
re_keys = [sl.replace('t["', '').replace('"]', '') for sl in re_sublist]
break
response_js.close()
if len(re_keys) == key_count:
break
if len(re_keys) > 0:
re_obj = {}
missing_val = False
for k in re_keys:
if not re_data.get(k):
missing_val = True
break
re_obj.update({k: re_data.get(k)})
if not missing_val:
return [''.join(re_obj.values())]
return []
@lru_cache_freezeargs
@lru_cache(maxsize=cache_maxsize)
def get_json_data_stores(self, sub_page: str = None, proxy=None) -> dict:
'''
get_json_data_stores returns a python dictionary of the data stores in yahoo finance web page.
'''
if sub_page:
ticker_url = "{}/{}/{}".format(_SCRAPE_URL_, self.ticker, sub_page)
else:
ticker_url = "{}/{}".format(_SCRAPE_URL_, self.ticker)
response = self.get(url=ticker_url, proxy=proxy)
html = response.text
# The actual json-data for stores is in a javascript assignment in the webpage
try:
json_str = html.split('root.App.main =')[1].split(
'(this)')[0].split(';\n}')[0].strip()
except IndexError:
# Fetch failed, probably because Yahoo spam triggered
return {}
data = json.loads(json_str)
# Gather decryption keys:
soup = BeautifulSoup(response.content, "html.parser")
keys = self._get_decryption_keys_from_yahoo_js(soup)
if len(keys) == 0:
msg = "No decryption keys could be extracted from JS file."
if "requests_cache" in str(type(response)):
msg += " Try flushing your 'requests_cache', probably parsing old JS."
print("WARNING: " + msg + " Falling back to backup decrypt methods.")
if len(keys) == 0:
keys = []
try:
extra_keys = _extract_extra_keys_from_stores(data)
keys = [''.join(extra_keys[-4:])]
except:
pass
#
keys_url = "https://github.com/ranaroussi/yfinance/raw/main/yfinance/scrapers/yahoo-keys.txt"
response_gh = self.cache_get(keys_url)
keys += response_gh.text.splitlines()
# Decrypt!
stores = decrypt_cryptojs_aes_stores(data, keys)
if stores is None:
# Maybe Yahoo returned old format, not encrypted
if "context" in data and "dispatcher" in data["context"]:
stores = data['context']['dispatcher']['stores']
if stores is None:
raise Exception(f"{self.ticker}: Failed to extract data stores from web request")
# return data
new_data = json.dumps(stores).replace('{}', 'null')
new_data = re.sub(
r'{[\'|\"]raw[\'|\"]:(.*?),(.*?)}', r'\1', new_data)
return json.loads(new_data)
def get_raw_json(self, url, user_agent_headers=None, params=None, proxy=None, timeout=30):
response = self.get(url, user_agent_headers=user_agent_headers, params=params, proxy=proxy, timeout=timeout)
response.raise_for_status()
return response.json()
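As a toy illustration of the split chain in get_json_data_stores above (the HTML here is a stand-in, not a real Yahoo page):

import json

html = 'junk; root.App.main = {"context": {"dispatcher": {"stores": "..."}}};\n}(this));'
json_str = html.split('root.App.main =')[1].split('(this)')[0].split(';\n}')[0].strip()
data = json.loads(json_str)
assert "context" in data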

View File

@ -4,3 +4,9 @@ class YFinanceException(Exception):
class YFinanceDataException(YFinanceException):
pass
class YFNotImplementedError(NotImplementedError):
def __init__(self, method_name):
super().__init__(f"Have not implemented fetching '{method_name}' from Yahoo API")
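A small sketch of the caller-side effect (using the class above; because it subclasses NotImplementedError, existing handlers keep working):

try:
    raise YFNotImplementedError('sustainability')
except NotImplementedError as err:
    print(err)   # Have not implemented fetching 'sustainability' from Yahoo API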

View File

@ -21,6 +21,8 @@
from __future__ import print_function
import logging
import traceback
import time as _time
import multitasking as _multitasking
import pandas as _pd
@ -28,11 +30,11 @@ import pandas as _pd
from . import Ticker, utils
from . import shared
@utils.log_indent_decorator
def download(tickers, start=None, end=None, actions=False, threads=True, ignore_tz=None,
group_by='column', auto_adjust=False, back_adjust=False, repair=False, keepna=False,
progress=True, period="max", show_errors=True, interval="1d", prepost=False,
proxy=None, rounding=False, timeout=10):
progress=True, period="max", show_errors=None, interval="1d", prepost=False,
proxy=None, rounding=False, timeout=10, session=None):
"""Download yahoo tickers
:Parameters:
tickers : str, list
@ -44,11 +46,13 @@ def download(tickers, start=None, end=None, actions=False, threads=True, ignore_
Valid intervals: 1m,2m,5m,15m,30m,60m,90m,1h,1d,5d,1wk,1mo,3mo
Intraday data cannot extend last 60 days
start: str
Download start date string (YYYY-MM-DD) or _datetime.
Download start date string (YYYY-MM-DD) or _datetime, inclusive.
Default is 1900-01-01
E.g. for start="2020-01-01", the first data point will be on "2020-01-01"
end: str
Download end date string (YYYY-MM-DD) or _datetime.
Download end date string (YYYY-MM-DD) or _datetime, exclusive.
Default is now
E.g. for end="2023-01-01", the last data point will be on "2022-12-31"
group_by : str
Group by 'ticker' or 'column' (default)
prepost : bool
@ -75,10 +79,33 @@ def download(tickers, start=None, end=None, actions=False, threads=True, ignore_
Optional. Round values to 2 decimal places?
show_errors: bool
Optional. Doesn't print errors if False
DEPRECATED, will be removed in future version
timeout: None or float
If not None stops waiting for a response after given number of
seconds. (Can also be a fraction of a second e.g. 0.01)
session: None or Session
Optional. Pass your own session object to be used for all requests
"""
logger = utils.get_yf_logger()
if show_errors is not None:
if show_errors:
utils.print_once(f"yfinance: download(show_errors={show_errors}) argument is deprecated and will be removed in future version. Do this instead: logging.getLogger('yfinance').setLevel(logging.ERROR)")
logger.setLevel(logging.ERROR)
else:
utils.print_once(f"yfinance: download(show_errors={show_errors}) argument is deprecated and will be removed in future version. Do this instead to suppress error messages: logging.getLogger('yfinance').setLevel(logging.CRITICAL)")
logger.setLevel(logging.CRITICAL)
if logger.isEnabledFor(logging.DEBUG):
if threads:
# With DEBUG, each thread generates a lot of log messages.
# And with multi-threading, these messages will be interleaved, bad!
# So disable multi-threading to make log readable.
logger.debug('Disabling multithreading because DEBUG logging enabled')
threads = False
if progress:
# Disable progress bar, interferes with display of log messages
progress = False
if ignore_tz is None:
# Set default value depending on interval
@ -98,7 +125,7 @@ def download(tickers, start=None, end=None, actions=False, threads=True, ignore_
for ticker in tickers:
if utils.is_isin(ticker):
isin = ticker
ticker = utils.get_ticker_by_isin(ticker, proxy)
ticker = utils.get_ticker_by_isin(ticker, proxy, session=session)
shared._ISINS[ticker] = isin
_tickers_.append(ticker)
@ -112,6 +139,7 @@ def download(tickers, start=None, end=None, actions=False, threads=True, ignore_
# reset shared._DFS
shared._DFS = {}
shared._ERRORS = {}
shared._TRACEBACKS = {}
# download using threads
if threads:
@ -124,10 +152,9 @@ def download(tickers, start=None, end=None, actions=False, threads=True, ignore_
actions=actions, auto_adjust=auto_adjust,
back_adjust=back_adjust, repair=repair, keepna=keepna,
progress=(progress and i > 0), proxy=proxy,
rounding=rounding, timeout=timeout)
rounding=rounding, timeout=timeout, session=session)
while len(shared._DFS) < len(tickers):
_time.sleep(0.01)
# download synchronously
else:
for i, ticker in enumerate(tickers):
@ -136,20 +163,42 @@ def download(tickers, start=None, end=None, actions=False, threads=True, ignore_
actions=actions, auto_adjust=auto_adjust,
back_adjust=back_adjust, repair=repair, keepna=keepna,
proxy=proxy,
rounding=rounding, timeout=timeout)
shared._DFS[ticker.upper()] = data
rounding=rounding, timeout=timeout, session=session)
if progress:
shared._PROGRESS_BAR.animate()
if progress:
shared._PROGRESS_BAR.completed()
if shared._ERRORS and show_errors:
print('\n%.f Failed download%s:' % (
if shared._ERRORS:
# Send errors to logging module
logger = utils.get_yf_logger()
logger.error('\n%.f Failed download%s:' % (
len(shared._ERRORS), 's' if len(shared._ERRORS) > 1 else ''))
# print(shared._ERRORS)
print("\n".join(['- %s: %s' %
v for v in list(shared._ERRORS.items())]))
# Log each distinct error once, with list of symbols affected
errors = {}
for ticker in shared._ERRORS:
err = shared._ERRORS[ticker]
err = err.replace(f'{ticker}', '%ticker%')
if not err in errors:
errors[err] = [ticker]
else:
errors[err].append(ticker)
for err in errors.keys():
logger.error(f'{errors[err]}: ' + err)
# Log each distinct traceback once, with list of symbols affected
tbs = {}
for ticker in shared._TRACEBACKS:
tb = shared._TRACEBACKS[ticker]
tb = tb.replace(f'{ticker}', '%ticker%')
if not tb in tbs:
tbs[tb] = [ticker]
else:
tbs[tb].append(ticker)
for tb in tbs.keys():
logger.debug(f'{tbs[tb]}: ' + tb)
if ignore_tz:
for tkr in shared._DFS.keys():
@ -206,17 +255,10 @@ def _download_one_threaded(ticker, start=None, end=None,
auto_adjust=False, back_adjust=False, repair=False,
actions=False, progress=True, period="max",
interval="1d", prepost=False, proxy=None,
keepna=False, rounding=False, timeout=10):
try:
data = _download_one(ticker, start, end, auto_adjust, back_adjust, repair,
actions, period, interval, prepost, proxy, rounding,
keepna, timeout)
except Exception as e:
# global try/except needed as current thread implementation breaks if exception is raised.
shared._DFS[ticker] = utils.empty_df()
shared._ERRORS[ticker] = repr(e)
else:
shared._DFS[ticker.upper()] = data
keepna=False, rounding=False, timeout=10, session=None):
data = _download_one(ticker, start, end, auto_adjust, back_adjust, repair,
actions, period, interval, prepost, proxy, rounding,
keepna, timeout, session)
if progress:
shared._PROGRESS_BAR.animate()
@ -225,12 +267,23 @@ def _download_one(ticker, start=None, end=None,
auto_adjust=False, back_adjust=False, repair=False,
actions=False, period="max", interval="1d",
prepost=False, proxy=None, rounding=False,
keepna=False, timeout=10):
return Ticker(ticker).history(
period=period, interval=interval,
start=start, end=end, prepost=prepost,
actions=actions, auto_adjust=auto_adjust,
back_adjust=back_adjust, repair=repair, proxy=proxy,
rounding=rounding, keepna=keepna, timeout=timeout,
debug=False, raise_errors=False # debug and raise_errors false to not log and raise errors in threads
)
keepna=False, timeout=10, session=None):
data = None
try:
data = Ticker(ticker, session=session).history(
period=period, interval=interval,
start=start, end=end, prepost=prepost,
actions=actions, auto_adjust=auto_adjust,
back_adjust=back_adjust, repair=repair, proxy=proxy,
rounding=rounding, keepna=keepna, timeout=timeout,
raise_errors=True
)
except Exception as e:
# global try/except needed as current thread implementation breaks if exception is raised.
shared._DFS[ticker.upper()] = utils.empty_df()
shared._ERRORS[ticker.upper()] = repr(e)
shared._TRACEBACKS[ticker.upper()] = traceback.format_exc()
else:
shared._DFS[ticker.upper()] = data
return data
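Putting the signature changes together, a usage sketch (symbols are just examples): start is documented as inclusive and end as exclusive, show_errors is deprecated in favour of the logging module, and a shared session can now be passed through:

import logging
import requests
import yfinance as yf

logging.getLogger('yfinance').setLevel(logging.ERROR)   # replaces show_errors=True

session = requests.Session()
# Daily bars for calendar year 2022: end="2023-01-01" is exclusive,
# so the last row is the final trading day of 2022.
df = yf.download(["MSFT", "AAPL"], start="2022-01-01", end="2023-01-01",
                 interval="1d", session=session)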

View File

@ -2,6 +2,7 @@ import pandas as pd
from yfinance import utils
from yfinance.data import TickerData
from yfinance.exceptions import YFNotImplementedError
class Analysis:
@ -20,99 +21,29 @@ class Analysis:
@property
def earnings_trend(self) -> pd.DataFrame:
if self._earnings_trend is None:
self._scrape(self.proxy)
raise YFNotImplementedError('earnings_trend')
return self._earnings_trend
@property
def analyst_trend_details(self) -> pd.DataFrame:
if self._analyst_trend_details is None:
self._scrape(self.proxy)
raise YFNotImplementedError('analyst_trend_details')
return self._analyst_trend_details
@property
def analyst_price_target(self) -> pd.DataFrame:
if self._analyst_price_target is None:
self._scrape(self.proxy)
raise YFNotImplementedError('analyst_price_target')
return self._analyst_price_target
@property
def rev_est(self) -> pd.DataFrame:
if self._rev_est is None:
self._scrape(self.proxy)
raise YFNotImplementedError('rev_est')
return self._rev_est
@property
def eps_est(self) -> pd.DataFrame:
if self._eps_est is None:
self._scrape(self.proxy)
raise YFNotImplementedError('eps_est')
return self._eps_est
def _scrape(self, proxy):
if self._already_scraped:
return
self._already_scraped = True
# Analysis Data/Analyst Forecasts
analysis_data = self._data.get_json_data_stores("analysis", proxy=proxy)
try:
analysis_data = analysis_data['QuoteSummaryStore']
except KeyError as e:
err_msg = "No analysis data found, symbol may be delisted"
print('- %s: %s' % (self._data.ticker, err_msg))
return
if isinstance(analysis_data.get('earningsTrend'), dict):
try:
analysis = pd.DataFrame(analysis_data['earningsTrend']['trend'])
analysis['endDate'] = pd.to_datetime(analysis['endDate'])
analysis.set_index('period', inplace=True)
analysis.index = analysis.index.str.upper()
analysis.index.name = 'Period'
analysis.columns = utils.camel2title(analysis.columns)
dict_cols = []
for idx, row in analysis.iterrows():
for colname, colval in row.items():
if isinstance(colval, dict):
dict_cols.append(colname)
for k, v in colval.items():
new_colname = colname + ' ' + \
utils.camel2title([k])[0]
analysis.loc[idx, new_colname] = v
self._earnings_trend = analysis[[
c for c in analysis.columns if c not in dict_cols]]
except Exception:
pass
try:
self._analyst_trend_details = pd.DataFrame(analysis_data['recommendationTrend']['trend'])
except Exception as e:
self._analyst_trend_details = None
try:
self._analyst_price_target = pd.DataFrame(analysis_data['financialData'], index=[0])[
['targetLowPrice', 'currentPrice', 'targetMeanPrice', 'targetHighPrice', 'numberOfAnalystOpinions']].T
except Exception as e:
self._analyst_price_target = None
earnings_estimate = []
revenue_estimate = []
if self._analyst_trend_details is not None :
for key in analysis_data['earningsTrend']['trend']:
try:
earnings_dict = key['earningsEstimate']
earnings_dict['period'] = key['period']
earnings_dict['endDate'] = key['endDate']
earnings_estimate.append(earnings_dict)
revenue_dict = key['revenueEstimate']
revenue_dict['period'] = key['period']
revenue_dict['endDate'] = key['endDate']
revenue_estimate.append(revenue_dict)
except Exception as e:
pass
self._rev_est = pd.DataFrame(revenue_estimate)
self._eps_est = pd.DataFrame(earnings_estimate)
else:
self._rev_est = pd.DataFrame()
self._eps_est = pd.DataFrame()

View File

@ -1,13 +1,13 @@
import datetime
import logging
import json
import pandas as pd
import numpy as np
from yfinance import utils
from yfinance import utils, const
from yfinance.data import TickerData
from yfinance.exceptions import YFinanceDataException, YFinanceException
from yfinance.exceptions import YFinanceException, YFNotImplementedError
class Fundamentals:
@ -31,71 +31,15 @@ class Fundamentals:
@property
def earnings(self) -> dict:
if self._earnings is None:
self._scrape_earnings(self.proxy)
raise YFNotImplementedError('earnings')
return self._earnings
@property
def shares(self) -> pd.DataFrame:
if self._shares is None:
self._scrape_shares(self.proxy)
raise YFNotImplementedError('shares')
return self._shares
def _scrape_basics(self, proxy):
if self._basics_already_scraped:
return
self._basics_already_scraped = True
self._financials_data = self._data.get_json_data_stores('financials', proxy)
try:
self._fin_data_quote = self._financials_data['QuoteSummaryStore']
except KeyError:
err_msg = "No financials data found, symbol may be delisted"
print('- %s: %s' % (self._data.ticker, err_msg))
return None
def _scrape_earnings(self, proxy):
self._scrape_basics(proxy)
# earnings
self._earnings = {"yearly": pd.DataFrame(), "quarterly": pd.DataFrame()}
if self._fin_data_quote is None:
return
if isinstance(self._fin_data_quote.get('earnings'), dict):
try:
earnings = self._fin_data_quote['earnings']['financialsChart']
earnings['financialCurrency'] = self._fin_data_quote['earnings'].get('financialCurrency', 'USD')
self._earnings['financialCurrency'] = earnings['financialCurrency']
df = pd.DataFrame(earnings['yearly']).set_index('date')
df.columns = utils.camel2title(df.columns)
df.index.name = 'Year'
self._earnings['yearly'] = df
df = pd.DataFrame(earnings['quarterly']).set_index('date')
df.columns = utils.camel2title(df.columns)
df.index.name = 'Quarter'
self._earnings['quarterly'] = df
except Exception:
pass
def _scrape_shares(self, proxy):
self._scrape_basics(proxy)
# shares outstanding
try:
# keep only years with non None data
available_shares = [shares_data for shares_data in
self._financials_data['QuoteTimeSeriesStore']['timeSeries']['annualBasicAverageShares']
if
shares_data]
shares = pd.DataFrame(available_shares)
shares['Year'] = shares['asOfDate'].agg(lambda x: int(x[:4]))
shares.set_index('Year', inplace=True)
shares.drop(columns=['dataId', 'asOfDate',
'periodType', 'currencyCode'], inplace=True)
shares.rename(
columns={'reportedValue': "BasicShares"}, inplace=True)
self._shares = shares
except Exception:
pass
class Financials:
def __init__(self, data: TickerData):
@ -103,9 +47,6 @@ class Financials:
self._income_time_series = {}
self._balance_sheet_time_series = {}
self._cash_flow_time_series = {}
self._income_scraped = {}
self._balance_sheet_scraped = {}
self._cash_flow_scraped = {}
def get_income_time_series(self, freq="yearly", proxy=None) -> pd.DataFrame:
res = self._income_time_series
@ -125,6 +66,7 @@ class Financials:
res[freq] = self._fetch_time_series("cash-flow", freq, proxy)
return res[freq]
@utils.log_indent_decorator
def _fetch_time_series(self, name, timescale, proxy=None):
# Fetching time series is preferred over scraping 'QuoteSummaryStore',
# because it matches what Yahoo shows. But for some tickers it returns nothing,
@ -144,7 +86,7 @@ class Financials:
if statement is not None:
return statement
except YFinanceException as e:
print(f"- {self._data.ticker}: Failed to create {name} financials table for reason: {repr(e)}")
utils.get_yf_logger().error("%s: Failed to create %s financials table for reason: %r", self._data.ticker, name, e)
return pd.DataFrame()
def _create_financials_table(self, name, timescale, proxy):
@ -152,37 +94,13 @@ class Financials:
# Yahoo stores the 'income' table internally under 'financials' key
name = "financials"
keys = self._get_datastore_keys(name, proxy)
keys = const.fundamentals_keys[name]
try:
return self.get_financials_time_series(timescale, keys, proxy)
except Exception as e:
pass
def _get_datastore_keys(self, sub_page, proxy) -> list:
data_stores = self._data.get_json_data_stores(sub_page, proxy)
# Step 1: get the keys:
def _finditem1(key, obj):
values = []
if isinstance(obj, dict):
if key in obj.keys():
values.append(obj[key])
for k, v in obj.items():
values += _finditem1(key, v)
elif isinstance(obj, list):
for v in obj:
values += _finditem1(key, v)
return values
try:
keys = _finditem1("key", data_stores['FinancialTemplateStore'])
except KeyError as e:
raise YFinanceDataException("Parsing FinancialTemplateStore failed, reason: {}".format(repr(e)))
if not keys:
raise YFinanceDataException("No keys in FinancialTemplateStore")
return keys
def get_financials_time_series(self, timescale, keys: list, proxy=None) -> pd.DataFrame:
timescale_translation = {"yearly": "annual", "quarterly": "quarterly"}
timescale = timescale_translation[timescale]
@ -231,89 +149,3 @@ class Financials:
df = df[sorted(df.columns, reverse=True)]
return df
def get_income_scrape(self, freq="yearly", proxy=None) -> pd.DataFrame:
res = self._income_scraped
if freq not in res:
res[freq] = self._scrape("income", freq, proxy)
return res[freq]
def get_balance_sheet_scrape(self, freq="yearly", proxy=None) -> pd.DataFrame:
res = self._balance_sheet_scraped
if freq not in res:
res[freq] = self._scrape("balance-sheet", freq, proxy)
return res[freq]
def get_cash_flow_scrape(self, freq="yearly", proxy=None) -> pd.DataFrame:
res = self._cash_flow_scraped
if freq not in res:
res[freq] = self._scrape("cash-flow", freq, proxy)
return res[freq]
def _scrape(self, name, timescale, proxy=None):
# Backup in case _fetch_time_series() fails to return data
allowed_names = ["income", "balance-sheet", "cash-flow"]
allowed_timescales = ["yearly", "quarterly"]
if name not in allowed_names:
raise ValueError("Illegal argument: name must be one of: {}".format(allowed_names))
if timescale not in allowed_timescales:
raise ValueError("Illegal argument: timescale must be one of: {}".format(allowed_names))
try:
statement = self._create_financials_table_old(name, timescale, proxy)
if statement is not None:
return statement
except YFinanceException as e:
print(f"- {self._data.ticker}: Failed to create financials table for {name} reason: {repr(e)}")
return pd.DataFrame()
def _create_financials_table_old(self, name, timescale, proxy):
data_stores = self._data.get_json_data_stores("financials", proxy)
# Fetch raw data
if not "QuoteSummaryStore" in data_stores:
raise YFinanceDataException(f"Yahoo not returning legacy financials data")
data = data_stores["QuoteSummaryStore"]
if name == "cash-flow":
key1 = "cashflowStatement"
key2 = "cashflowStatements"
elif name == "balance-sheet":
key1 = "balanceSheet"
key2 = "balanceSheetStatements"
else:
key1 = "incomeStatement"
key2 = "incomeStatementHistory"
key1 += "History"
if timescale == "quarterly":
key1 += "Quarterly"
if key1 not in data or data[key1] is None or key2 not in data[key1]:
raise YFinanceDataException(f"Yahoo not returning legacy {name} financials data")
data = data[key1][key2]
# Tabulate
df = pd.DataFrame(data)
if len(df) == 0:
raise YFinanceDataException(f"Yahoo not returning legacy {name} financials data")
df = df.drop(columns=['maxAge'])
for col in df.columns:
df[col] = df[col].replace('-', np.nan)
df.set_index('endDate', inplace=True)
try:
df.index = pd.to_datetime(df.index, unit='s')
except ValueError:
df.index = pd.to_datetime(df.index)
df = df.T
df.columns.name = ''
df.index.name = 'Breakdown'
# rename incorrect yahoo key
df.rename(index={'treasuryStock': 'gainsLossesNotAffectingRetainedEarnings'}, inplace=True)
# Upper-case first letter, leave rest unchanged:
s0 = df.index[0]
df.index = [s[0].upper()+s[1:] for s in df.index]
return df
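For reference, a toy version of the tabulation the removed legacy path performed (fabricated figures; 'endDate' arrives as epoch seconds):

import pandas as pd

data = [{"endDate": 1672444800, "totalRevenue": 100.0, "maxAge": 86400},
        {"endDate": 1640908800, "totalRevenue": 90.0, "maxAge": 86400}]
df = pd.DataFrame(data).drop(columns=["maxAge"]).set_index("endDate")
df.index = pd.to_datetime(df.index, unit="s")
df = df.T
df.index = [s[0].upper() + s[1:] for s in df.index]   # totalRevenue -> TotalRevenue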

View File

@ -1,11 +1,14 @@
import datetime
import logging
import json
import warnings
import pandas as pd
import numpy as _np
from yfinance import utils
from yfinance.data import TickerData
from yfinance.exceptions import YFNotImplementedError
info_retired_keys_price = {"currentPrice", "dayHigh", "dayLow", "open", "previousClose", "volume", "volume24Hr"}
info_retired_keys_price.update({"regularMarket"+s for s in ["DayHigh", "DayLow", "Open", "PreviousClose", "Price", "Volume"]})
@ -17,9 +20,7 @@ info_retired_keys_symbol = {"symbol"}
info_retired_keys = info_retired_keys_price | info_retired_keys_exchange | info_retired_keys_marketCap | info_retired_keys_symbol
PRUNE_INFO = True
# PRUNE_INFO = False
_BASIC_URL_ = "https://query2.finance.yahoo.com/v6/finance/quoteSummary"
from collections.abc import MutableMapping
class InfoDictWrapper(MutableMapping):
@ -44,16 +45,16 @@ class InfoDictWrapper(MutableMapping):
def __getitem__(self, k):
if k in info_retired_keys_price:
print(f"Price data removed from info (key='{k}'). Use Ticker.fast_info or history() instead")
warnings.warn(f"Price data removed from info (key='{k}'). Use Ticker.fast_info or history() instead", DeprecationWarning)
return None
elif k in info_retired_keys_exchange:
print(f"Exchange data removed from info (key='{k}'). Use Ticker.fast_info or Ticker.get_history_metadata() instead")
warnings.warn(f"Exchange data removed from info (key='{k}'). Use Ticker.fast_info or Ticker.get_history_metadata() instead", DeprecationWarning)
return None
elif k in info_retired_keys_marketCap:
print(f"Market cap removed from info (key='{k}'). Use Ticker.fast_info instead")
warnings.warn(f"Market cap removed from info (key='{k}'). Use Ticker.fast_info instead", DeprecationWarning)
return None
elif k in info_retired_keys_symbol:
print(f"Symbol removed from info (key='{k}'). You know this already")
warnings.warn(f"Symbol removed from info (key='{k}'). You know this already", DeprecationWarning)
return None
return self.info[self._keytransform(k)]
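A sketch of the new behaviour (assuming the wrapper is constructed around a plain info dict, as its methods suggest): retired keys now emit a DeprecationWarning instead of printing:

import warnings

info = InfoDictWrapper({"trailingPE": 30.0})
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    assert info["open"] is None                    # retired price key
    assert caught[0].category is DeprecationWarning
assert info["trailingPE"] == 30.0                  # normal keys pass through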
@ -73,6 +74,471 @@ class InfoDictWrapper(MutableMapping):
return k
class FastInfo:
# Contains a small subset of info[] items that can be fetched faster elsewhere.
# Imitates a dict.
def __init__(self, tickerBaseObject):
self._tkr = tickerBaseObject
self._prices_1y = None
self._prices_1wk_1h_prepost = None
self._prices_1wk_1h_reg = None
self._md = None
self._currency = None
self._quote_type = None
self._exchange = None
self._timezone = None
self._shares = None
self._mcap = None
self._open = None
self._day_high = None
self._day_low = None
self._last_price = None
self._last_volume = None
self._prev_close = None
self._reg_prev_close = None
self._50d_day_average = None
self._200d_day_average = None
self._year_high = None
self._year_low = None
self._year_change = None
self._10d_avg_vol = None
self._3mo_avg_vol = None
# attrs = utils.attributes(self)
# self.keys = attrs.keys()
# utils.attributes would call each method - bad! Have to hardcode.
_properties = ["currency", "quote_type", "exchange", "timezone"]
_properties += ["shares", "market_cap"]
_properties += ["last_price", "previous_close", "open", "day_high", "day_low"]
_properties += ["regular_market_previous_close"]
_properties += ["last_volume"]
_properties += ["fifty_day_average", "two_hundred_day_average", "ten_day_average_volume", "three_month_average_volume"]
_properties += ["year_high", "year_low", "year_change"]
# Because this was released before the key case was fixed, need to officially
# support camelCase but also quietly support snake_case
base_keys = [k for k in _properties if not '_' in k]
sc_keys = [k for k in _properties if '_' in k]
self._sc_to_cc_key = {k:utils.snake_case_2_camelCase(k) for k in sc_keys}
self._cc_to_sc_key = {v:k for k,v in self._sc_to_cc_key.items()}
self._public_keys = sorted(base_keys + list(self._sc_to_cc_key.values()))
self._keys = sorted(self._public_keys + sc_keys)
# dict imitation:
def keys(self):
return self._public_keys
def items(self):
return [(k,self[k]) for k in self._public_keys]
def values(self):
return [self[k] for k in self._public_keys]
def get(self, key, default=None):
if key in self.keys():
if key in self._cc_to_sc_key:
key = self._cc_to_sc_key[key]
return self[key]
return default
def __getitem__(self, k):
if not isinstance(k, str):
raise KeyError(f"key must be a string")
if not k in self._keys:
raise KeyError(f"'{k}' not valid key. Examine 'FastInfo.keys()'")
if k in self._cc_to_sc_key:
k = self._cc_to_sc_key[k]
return getattr(self, k)
def __contains__(self, k):
return k in self.keys()
def __iter__(self):
return iter(self.keys())
def __str__(self):
return "lazy-loading dict with keys = " + str(self.keys())
def __repr__(self):
return self.__str__()
def toJSON(self, indent=4):
    d = {k: self[k] for k in self.keys()}
    return json.dumps(d, indent=indent)
def _get_1y_prices(self, fullDaysOnly=False):
if self._prices_1y is None:
# Temporarily disable error printing
logging.disable(logging.CRITICAL)
self._prices_1y = self._tkr.history(period="380d", auto_adjust=False, keepna=True)
logging.disable(logging.NOTSET)
self._md = self._tkr.get_history_metadata()
try:
ctp = self._md["currentTradingPeriod"]
self._today_open = pd.to_datetime(ctp["regular"]["start"], unit='s', utc=True).tz_convert(self.timezone)
self._today_close = pd.to_datetime(ctp["regular"]["end"], unit='s', utc=True).tz_convert(self.timezone)
self._today_midnight = self._today_close.ceil("D")
except:
self._today_open = None
self._today_close = None
self._today_midnight = None
raise
if self._prices_1y.empty:
return self._prices_1y
dnow = pd.Timestamp.utcnow().tz_convert(self.timezone).date()
d1 = dnow
d0 = (d1 + datetime.timedelta(days=1)) - utils._interval_to_timedelta("1y")
if fullDaysOnly and self._exchange_open_now():
# Exclude today
d1 -= utils._interval_to_timedelta("1d")
return self._prices_1y.loc[str(d0):str(d1)]
def _get_1wk_1h_prepost_prices(self):
if self._prices_1wk_1h_prepost is None:
# Temporarily disable error printing
logging.disable(logging.CRITICAL)
self._prices_1wk_1h_prepost = self._tkr.history(period="1wk", interval="1h", auto_adjust=False, prepost=True)
logging.disable(logging.NOTSET)
return self._prices_1wk_1h_prepost
def _get_1wk_1h_reg_prices(self):
if self._prices_1wk_1h_reg is None:
# Temporarily disable error printing
logging.disable(logging.CRITICAL)
self._prices_1wk_1h_reg = self._tkr.history(period="1wk", interval="1h", auto_adjust=False, prepost=False)
logging.disable(logging.NOTSET)
return self._prices_1wk_1h_reg
def _get_exchange_metadata(self):
if self._md is not None:
return self._md
self._get_1y_prices()
self._md = self._tkr.get_history_metadata()
return self._md
def _exchange_open_now(self):
t = pd.Timestamp.utcnow()
self._get_exchange_metadata()
# if self._today_open is None and self._today_close is None:
# r = False
# else:
# r = self._today_open <= t and t < self._today_close
# if self._today_midnight is None:
# r = False
# elif self._today_midnight.date() > t.tz_convert(self.timezone).date():
# r = False
# else:
# r = t < self._today_midnight
last_day_cutoff = self._get_1y_prices().index[-1] + datetime.timedelta(days=1)
last_day_cutoff += datetime.timedelta(minutes=20)
r = t < last_day_cutoff
# print("_exchange_open_now() returning", r)
return r
@property
def currency(self):
if self._currency is not None:
return self._currency
if self._tkr._history_metadata is None:
self._get_1y_prices()
md = self._tkr.get_history_metadata()
self._currency = md["currency"]
return self._currency
@property
def quote_type(self):
if self._quote_type is not None:
return self._quote_type
if self._tkr._history_metadata is None:
self._get_1y_prices()
md = self._tkr.get_history_metadata()
self._quote_type = md["instrumentType"]
return self._quote_type
@property
def exchange(self):
if self._exchange is not None:
return self._exchange
self._exchange = self._get_exchange_metadata()["exchangeName"]
return self._exchange
@property
def timezone(self):
if self._timezone is not None:
return self._timezone
self._timezone = self._get_exchange_metadata()["exchangeTimezoneName"]
return self._timezone
@property
def shares(self):
if self._shares is not None:
return self._shares
shares = self._tkr.get_shares_full(start=pd.Timestamp.utcnow().date()-pd.Timedelta(days=548))
# if shares is None:
# # Requesting 18 months failed, so fallback to shares which should include last year
# shares = self._tkr.get_shares()
if shares is not None:
if isinstance(shares, pd.DataFrame):
shares = shares[shares.columns[0]]
self._shares = int(shares.iloc[-1])
return self._shares
@property
def last_price(self):
if self._last_price is not None:
return self._last_price
prices = self._get_1y_prices()
if prices.empty:
md = self._get_exchange_metadata()
if "regularMarketPrice" in md:
self._last_price = md["regularMarketPrice"]
else:
self._last_price = float(prices["Close"].iloc[-1])
if _np.isnan(self._last_price):
md = self._get_exchange_metadata()
if "regularMarketPrice" in md:
self._last_price = md["regularMarketPrice"]
return self._last_price
@property
def previous_close(self):
if self._prev_close is not None:
return self._prev_close
prices = self._get_1wk_1h_prepost_prices()
fail = False
if prices.empty:
fail = True
else:
prices = prices[["Close"]].groupby(prices.index.date).last()
if prices.shape[0] < 2:
# Very few symbols have previousClose despite no trading data, e.g. 'QCSTIX'.
fail = True
else:
self._prev_close = float(prices["Close"].iloc[-2])
if fail:
# Fallback to original info[] if available.
self._tkr.info # trigger fetch
k = "previousClose"
if self._tkr._quote._retired_info is not None and k in self._tkr._quote._retired_info:
self._prev_close = self._tkr._quote._retired_info[k]
return self._prev_close
@property
def regular_market_previous_close(self):
if self._reg_prev_close is not None:
return self._reg_prev_close
prices = self._get_1y_prices()
if prices.shape[0] == 1:
# Tiny % of tickers don't return daily history before last trading day,
# so backup option is hourly history:
prices = self._get_1wk_1h_reg_prices()
prices = prices[["Close"]].groupby(prices.index.date).last()
if prices.shape[0] < 2:
# Very few symbols have regularMarketPreviousClose despite no trading data, e.g. 'QCSTIX'.
# So fallback to original info[] if available.
self._tkr.info # trigger fetch
k = "regularMarketPreviousClose"
if self._tkr._quote._retired_info is not None and k in self._tkr._quote._retired_info:
self._reg_prev_close = self._tkr._quote._retired_info[k]
else:
self._reg_prev_close = float(prices["Close"].iloc[-2])
return self._reg_prev_close
@property
def open(self):
if self._open is not None:
return self._open
prices = self._get_1y_prices()
if prices.empty:
self._open = None
else:
self._open = float(prices["Open"].iloc[-1])
if _np.isnan(self._open):
self._open = None
return self._open
@property
def day_high(self):
if self._day_high is not None:
return self._day_high
prices = self._get_1y_prices()
if prices.empty:
self._day_high = None
else:
self._day_high = float(prices["High"].iloc[-1])
if _np.isnan(self._day_high):
self._day_high = None
return self._day_high
@property
def day_low(self):
if self._day_low is not None:
return self._day_low
prices = self._get_1y_prices()
if prices.empty:
self._day_low = None
else:
self._day_low = float(prices["Low"].iloc[-1])
if _np.isnan(self._day_low):
self._day_low = None
return self._day_low
@property
def last_volume(self):
if self._last_volume is not None:
return self._last_volume
prices = self._get_1y_prices()
self._last_volume = None if prices.empty else int(prices["Volume"].iloc[-1])
return self._last_volume
@property
def fifty_day_average(self):
if self._50d_day_average is not None:
return self._50d_day_average
prices = self._get_1y_prices(fullDaysOnly=True)
if prices.empty:
self._50d_day_average = None
else:
n = prices.shape[0]
a = n-50
b = n
if a < 0:
a = 0
self._50d_day_average = float(prices["Close"].iloc[a:b].mean())
return self._50d_day_average
@property
def two_hundred_day_average(self):
if self._200d_day_average is not None:
return self._200d_day_average
prices = self._get_1y_prices(fullDaysOnly=True)
if prices.empty:
self._200d_day_average = None
else:
n = prices.shape[0]
a = n-200
b = n
if a < 0:
a = 0
self._200d_day_average = float(prices["Close"].iloc[a:b].mean())
return self._200d_day_average
@property
def ten_day_average_volume(self):
if self._10d_avg_vol is not None:
return self._10d_avg_vol
prices = self._get_1y_prices(fullDaysOnly=True)
if prices.empty:
self._10d_avg_vol = None
else:
n = prices.shape[0]
a = n-10
b = n
if a < 0:
a = 0
self._10d_avg_vol = int(prices["Volume"].iloc[a:b].mean())
return self._10d_avg_vol
@property
def three_month_average_volume(self):
if self._3mo_avg_vol is not None:
return self._3mo_avg_vol
prices = self._get_1y_prices(fullDaysOnly=True)
if prices.empty:
self._3mo_avg_vol = None
else:
dt1 = prices.index[-1]
dt0 = dt1 - utils._interval_to_timedelta("3mo") + utils._interval_to_timedelta("1d")
self._3mo_avg_vol = int(prices.loc[dt0:dt1, "Volume"].mean())
return self._3mo_avg_vol
@property
def year_high(self):
if self._year_high is not None:
return self._year_high
prices = self._get_1y_prices(fullDaysOnly=True)
if prices.empty:
prices = self._get_1y_prices(fullDaysOnly=False)
self._year_high = float(prices["High"].max())
return self._year_high
@property
def year_low(self):
if self._year_low is not None:
return self._year_low
prices = self._get_1y_prices(fullDaysOnly=True)
if prices.empty:
prices = self._get_1y_prices(fullDaysOnly=False)
self._year_low = float(prices["Low"].min())
return self._year_low
@property
def year_change(self):
if self._year_change is not None:
return self._year_change
prices = self._get_1y_prices(fullDaysOnly=True)
if prices.shape[0] >= 2:
self._year_change = (prices["Close"].iloc[-1] - prices["Close"].iloc[0]) / prices["Close"].iloc[0]
self._year_change = float(self._year_change)
return self._year_change
@property
def market_cap(self):
if self._mcap is not None:
return self._mcap
try:
shares = self.shares
except Exception as e:
if "Cannot retrieve share count" in str(e):
shares = None
elif "failed to decrypt Yahoo" in str(e):
shares = None
else:
raise
if shares is None:
# Very few symbols have marketCap despite no share count.
# E.g. 'BTC-USD'
# So fallback to original info[] if available.
self._tkr.info
k = "marketCap"
if self._tkr._quote._retired_info is not None and k in self._tkr._quote._retired_info:
self._mcap = self._tkr._quote._retired_info[k]
else:
self._mcap = float(shares * self.last_price)
return self._mcap
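Usage sketch of the dict imitation above: the advertised keys are camelCase, but the snake_case spellings are quietly accepted too:

import yfinance as yf

fi = yf.Ticker("MSFT").fast_info
print(fi.keys()[:3])                               # camelCase names only
assert fi["fiftyDayAverage"] == fi["fifty_day_average"]
assert "fifty_day_average" not in fi               # __contains__ checks public keys only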
class Quote:
@ -87,161 +553,87 @@ class Quote:
self._calendar = None
self._already_scraped = False
self._already_scraped_complementary = False
self._already_fetched = False
self._already_fetched_complementary = False
@property
def info(self) -> dict:
if self._info is None:
self._scrape(self.proxy)
self._scrape_complementary(self.proxy)
self._fetch(self.proxy)
self._fetch_complementary(self.proxy)
return self._info
@property
def sustainability(self) -> pd.DataFrame:
if self._sustainability is None:
self._scrape(self.proxy)
raise YFNotImplementedError('sustainability')
return self._sustainability
@property
def recommendations(self) -> pd.DataFrame:
if self._recommendations is None:
self._scrape(self.proxy)
raise YFNotImplementedError('recommendations')
return self._recommendations
@property
def calendar(self) -> pd.DataFrame:
if self._calendar is None:
self._scrape(self.proxy)
raise YFNotImplementedError('calendar')
return self._calendar
def _scrape(self, proxy):
if self._already_scraped:
def _fetch(self, proxy):
if self._already_fetched:
return
self._already_scraped = True
# get info and sustainability
json_data = self._data.get_json_data_stores(proxy=proxy)
try:
quote_summary_store = json_data['QuoteSummaryStore']
except KeyError:
err_msg = "No summary info found, symbol may be delisted"
print('- %s: %s' % (self._data.ticker, err_msg))
return None
# sustainability
d = {}
try:
if isinstance(quote_summary_store.get('esgScores'), dict):
for item in quote_summary_store['esgScores']:
if not isinstance(quote_summary_store['esgScores'][item], (dict, list)):
d[item] = quote_summary_store['esgScores'][item]
s = pd.DataFrame(index=[0], data=d)[-1:].T
s.columns = ['Value']
s.index.name = '%.f-%.f' % (
s[s.index == 'ratingYear']['Value'].values[0],
s[s.index == 'ratingMonth']['Value'].values[0])
self._sustainability = s[~s.index.isin(
['maxAge', 'ratingYear', 'ratingMonth'])]
except Exception:
pass
self._info = {}
try:
items = ['summaryProfile', 'financialData', 'quoteType',
'defaultKeyStatistics', 'assetProfile', 'summaryDetail']
for item in items:
if isinstance(quote_summary_store.get(item), dict):
self._info.update(quote_summary_store[item])
except Exception:
pass
# For ETFs, provide this valuable data: the top holdings of the ETF
try:
if 'topHoldings' in quote_summary_store:
self._info.update(quote_summary_store['topHoldings'])
except Exception:
pass
try:
if not isinstance(quote_summary_store.get('summaryDetail'), dict):
# For some reason summaryDetail did not give any results. The price dict
# usually has most of the same info
self._info.update(quote_summary_store.get('price', {}))
except Exception:
pass
try:
# self._info['regularMarketPrice'] = self._info['regularMarketOpen']
self._info['regularMarketPrice'] = quote_summary_store.get('price', {}).get(
'regularMarketPrice', self._info.get('regularMarketOpen', None))
except Exception:
pass
try:
self._info['preMarketPrice'] = quote_summary_store.get('price', {}).get(
'preMarketPrice', self._info.get('preMarketPrice', None))
except Exception:
pass
self._info['logo_url'] = ""
try:
if not 'website' in self._info:
self._info['logo_url'] = 'https://logo.clearbit.com/%s.com' % \
self._info['shortName'].split(' ')[0].split(',')[0]
self._already_fetched = True
modules = ['financialData', 'quoteType', 'defaultKeyStatistics', 'assetProfile', 'summaryDetail']
params_dict = {}
params_dict["modules"] = modules
params_dict["ssl"] = "true"
result = self._data.get_raw_json(
_BASIC_URL_ + f"/{self._data.ticker}", params=params_dict, proxy=proxy
)
result["quoteSummary"]["result"][0]["symbol"] = self._data.ticker
query1_info = next(
(info for info in result.get("quoteSummary", {}).get("result", []) if info["symbol"] == self._data.ticker),
None,
)
# Most keys that appear in multiple dicts have the same value. The exception is 'maxAge',
# because Yahoo is not consistent about days vs seconds. Fix it here:
for k in query1_info:
if "maxAge" in query1_info[k] and query1_info[k]["maxAge"] == 1:
query1_info[k]["maxAge"] = 86400
query1_info = {
k1: v1
for k, v in query1_info.items()
if isinstance(v, dict)
for k1, v1 in v.items()
if v1
}
# Recursively format values. Recursion is only needed because of 'companyOfficers'.
def _format(k, v):
if isinstance(v, dict) and "raw" in v and "fmt" in v:
v2 = v["fmt"] if k in {"regularMarketTime", "postMarketTime"} else v["raw"]
elif isinstance(v, list):
v2 = [_format(None, x) for x in v]
elif isinstance(v, dict):
v2 = {k:_format(k, x) for k, x in v.items()}
elif isinstance(v, str):
v2 = v.replace("\xa0", " ")
else:
domain = self._info['website'].split(
'://')[1].split('/')[0].replace('www.', '')
self._info['logo_url'] = 'https://logo.clearbit.com/%s' % domain
except Exception:
pass
v2 = v
return v2
for k, v in query1_info.items():
query1_info[k] = _format(k, v)
self._info = query1_info
# Delete redundant info[] keys, because their values can be accessed faster
# elsewhere - e.g. the price keys. Hopefully this reduces the Yahoo spam effect.
# But record the dropped keys, because in rare cases they are needed.
self._retired_info = {}
for k in info_retired_keys:
if k in self._info:
self._retired_info[k] = self._info[k]
if PRUNE_INFO:
del self._info[k]
if PRUNE_INFO:
# InfoDictWrapper will explain how to access above data elsewhere
self._info = InfoDictWrapper(self._info)
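As a rough illustration of the normalise-flatten-format pipeline above (a minimal sketch with hand-written sample data, not the exact yfinance internals):

# Hypothetical quoteSummary payload, already reduced to its modules.
raw = {
    "summaryDetail": {"maxAge": 1, "previousClose": {"raw": 189.3, "fmt": "189.30"}},
    "quoteType": {"maxAge": 1, "shortName": "Apple\xa0Inc."},
}
# Normalise 'maxAge' (Yahoo mixes days and seconds), then flatten one level.
for module in raw.values():
    if module.get("maxAge") == 1:
        module["maxAge"] = 86400
info = {k: v for module in raw.values() if isinstance(module, dict)
        for k, v in module.items() if v}
# Unwrap {"raw": ..., "fmt": ...} pairs and clean strings, as _format() does.
def fmt(k, v):
    if isinstance(v, dict) and "raw" in v and "fmt" in v:
        return v["fmt"] if k in {"regularMarketTime", "postMarketTime"} else v["raw"]
    return v.replace("\xa0", " ") if isinstance(v, str) else v
info = {k: fmt(k, v) for k, v in info.items()}
print(info)  # {'maxAge': 86400, 'previousClose': 189.3, 'shortName': 'Apple Inc.'}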
# events
try:
cal = pd.DataFrame(quote_summary_store['calendarEvents']['earnings'])
cal['earningsDate'] = pd.to_datetime(
cal['earningsDate'], unit='s')
self._calendar = cal.T
self._calendar.index = utils.camel2title(self._calendar.index)
self._calendar.columns = ['Value']
except Exception as e:
pass
# analyst recommendations
try:
rec = pd.DataFrame(
quote_summary_store['upgradeDowngradeHistory']['history'])
rec['earningsDate'] = pd.to_datetime(
rec['epochGradeDate'], unit='s')
rec.set_index('earningsDate', inplace=True)
rec.index.name = 'Date'
rec.columns = utils.camel2title(rec.columns)
self._recommendations = rec[[
'Firm', 'To Grade', 'From Grade', 'Action']].sort_index()
except Exception:
pass
def _scrape_complementary(self, proxy):
if self._already_scraped_complementary:
def _fetch_complementary(self, proxy):
if self._already_fetched_complementary:
return
self._already_scraped_complementary = True
self._already_fetched_complementary = True
self._scrape(proxy)
# self._scrape(proxy) # decrypt broken
self._fetch(proxy)
if self._info is None:
return

View File

@ -4,3 +4,5 @@ e9a8ab8e5620b712ebc2fb4f33d5c8b9c80c0d07e8c371911c785cf674789f1747d76a909510158a
6ae2523aeafa283dad746556540145bf603f44edbf37ad404d3766a8420bb5eb1d3738f52a227b88283cca9cae44060d5f0bba84b6a495082589f5fe7acbdc9e
3365117c2a368ffa5df7313a4a84988f73926a86358e8eea9497c5ff799ce27d104b68e5f2fbffa6f8f92c1fef41765a7066fa6bcf050810a9c4c7872fd3ebf0
15d8f57919857d5a5358d2082c7ef0f1129cfacd2a6480333dcfb954b7bb67d820abefebfdb0eaa6ef18a1c57f617b67d7e7b0ec040403b889630ae5db5a4dbb
db9630d707a7d0953ac795cd8db1ca9ca6c9d8239197cdfda24b4e0ec9c37eaec4db82dab68b8f606ab7b5b4af3e65dab50606f8cf508269ec927e6ee605fb78
3c895fb5ddcc37d20d3073ed74ee3efad59bcb147c8e80fd279f83701b74b092d503dcd399604c6d8be8f3013429d3c2c76ed5b31b80c9df92d5eab6d3339fce

View File

@ -22,4 +22,5 @@
_DFS = {}
_PROGRESS_BAR = None
_ERRORS = {}
_TRACEBACKS = {}
_ISINS = {}

View File

@ -239,6 +239,10 @@ class Ticker(TickerBase):
def news(self):
return self.get_news()
@property
def trend_details(self) -> _pd.DataFrame:
return self.get_trend_details()
@property
def earnings_trend(self) -> _pd.DataFrame:
return self.get_earnings_trend()

View File

@ -87,10 +87,4 @@ class Tickers:
return data
def news(self):
collection = {}
for ticker in self.symbols:
collection[ticker] = []
items = Ticker(ticker).news
for item in items:
collection[ticker].append(item)
return collection
return {ticker: [item for item in Ticker(ticker).news] for ticker in self.symbols}

View File

@ -35,6 +35,8 @@ import os as _os
import appdirs as _ad
import sqlite3 as _sqlite3
import atexit as _atexit
from functools import lru_cache
import logging
from threading import Lock
@ -61,6 +63,115 @@ def attributes(obj):
if name[0] != '_' and name not in disallowed_names and hasattr(obj, name)}
@lru_cache(maxsize=20)
def print_once(msg):
# The 'warnings' module's suppression of repeated messages does not work.
# This function replicates the intended behaviour.
print(msg)
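Because lru_cache keys on the message string, repeat calls with an identical message are silently swallowed:

print_once("Yahoo returned bad data")  # printed
print_once("Yahoo returned bad data")  # suppressed, cached result reused
print_once("a different message")      # printed

Note the maxsize=20 cap: once more than 20 distinct messages pass through, the oldest can be evicted and printed again.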
## Logging
# Note: most of this logic adds indentation proportional to function depth,
# so that the DEBUG log is readable.
class IndentLoggerAdapter(logging.LoggerAdapter):
def process(self, msg, kwargs):
if get_yf_logger().isEnabledFor(logging.DEBUG):
i = ' ' * self.extra['indent']
if not isinstance(msg, str):
msg = str(msg)
msg = '\n'.join([i + m for m in msg.split('\n')])
return msg, kwargs
import threading
_indentation_level = threading.local()
class IndentationContext:
def __init__(self, increment=1):
self.increment = increment
def __enter__(self):
_indentation_level.indent = getattr(_indentation_level, 'indent', 0) + self.increment
def __exit__(self, exc_type, exc_val, exc_tb):
_indentation_level.indent -= self.increment
def get_indented_logger(name=None):
# Never cache the returned value! Will break indentation.
return IndentLoggerAdapter(logging.getLogger(name), {'indent': getattr(_indentation_level, 'indent', 0)})
def log_indent_decorator(func):
def wrapper(*args, **kwargs):
logger = get_indented_logger('yfinance')
logger.debug(f'Entering {func.__name__}()')
with IndentationContext():
result = func(*args, **kwargs)
logger.debug(f'Exiting {func.__name__}()')
return result
return wrapper
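A minimal sketch of the intended effect, assuming enable_debug_mode() (defined below) has been called so DEBUG output is live:

@log_indent_decorator
def inner():
    get_indented_logger('yfinance').debug('doing work')

@log_indent_decorator
def outer():
    inner()

outer()
# The DEBUG log then reads roughly:
#   DEBUG    Entering outer()
#   DEBUG     Entering inner()
#   DEBUG      doing work
#   DEBUG     Exiting inner()
#   DEBUG    Exiting outer()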
class MultiLineFormatter(logging.Formatter):
# The 'fmt' formatting further down is only applied to the first line
# of a log message, specifically the padding after the level name.
# For multi-line messages, the padding must be copied over manually.
def __init__(self, fmt):
super().__init__(fmt)
# Extract amount of padding
match = _re.search(r'%\(levelname\)-(\d+)s', fmt)
self.level_length = int(match.group(1)) if match else 0
def format(self, record):
original = super().format(record)
lines = original.split('\n')
levelname = lines[0].split(' ')[0]
if len(lines) <= 1:
return original
else:
# Apply padding to all lines after the first
formatted = [lines[0]]
if self.level_length == 0:
padding = ' ' * len(levelname)
else:
padding = ' ' * self.level_length
padding += ' ' # +1 for space between level and message
formatted.extend(padding + line for line in lines[1:])
return '\n'.join(formatted)
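For example, with the same fmt string that setup_debug_formatting() uses below, a two-line message comes out column-aligned:

import logging
demo = logging.getLogger('demo')
h = logging.StreamHandler()
h.setFormatter(MultiLineFormatter('%(levelname)-8s %(message)s'))
demo.addHandler(h)
demo.warning('first line\nsecond line')
# WARNING  first line
#          second line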
yf_logger = None
yf_log_indented = False
def get_yf_logger():
global yf_logger
if yf_logger is None:
yf_logger = logging.getLogger('yfinance')
global yf_log_indented
if yf_log_indented:
yf_logger = get_indented_logger('yfinance')
return yf_logger
def setup_debug_formatting():
global yf_logger
yf_logger = get_yf_logger()
if not yf_logger.isEnabledFor(logging.DEBUG):
yf_logger.warning("logging mode not set to 'DEBUG', so not setting up debug formatting")
return
if yf_logger.handlers is None or len(yf_logger.handlers) == 0:
h = logging.StreamHandler()
# Ensure different level strings don't interfere with indentation
formatter = MultiLineFormatter(fmt='%(levelname)-8s %(message)s')
h.setFormatter(formatter)
yf_logger.addHandler(h)
global yf_log_indented
yf_log_indented = True
def enable_debug_mode():
get_yf_logger().setLevel(logging.DEBUG)
setup_debug_formatting()
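From user code the entry point is simply:

import yfinance as yf
yf.enable_debug_mode()  # 'yfinance' logger at DEBUG, with indented multi-line-safe formatting
yf.Ticker("MSFT").history(period="1d")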
##
def is_isin(string):
return bool(_re.match("^([A-Z]{2})([A-Z0-9]{9})([0-9]{1})$", string))
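A couple of quick checks - the pattern is a 2-letter country code, 9 alphanumerics, and 1 check digit:

is_isin("US0378331005")  # True - a well-formed ISIN
is_isin("AAPL")          # False - a ticker symbol, wrong shape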
@ -330,7 +441,7 @@ def _interval_to_timedelta(interval):
elif interval == "1y":
return _dateutil.relativedelta.relativedelta(years=1)
elif interval == "1wk":
return _pd.Timedelta(days=7, unit='d')
return _pd.Timedelta(days=7)
else:
return _pd.Timedelta(interval)
@ -338,10 +449,10 @@ def _interval_to_timedelta(interval):
def auto_adjust(data):
col_order = data.columns
df = data.copy()
ratio = df["Close"] / df["Adj Close"]
df["Adj Open"] = df["Open"] / ratio
df["Adj High"] = df["High"] / ratio
df["Adj Low"] = df["Low"] / ratio
ratio = (df["Adj Close"] / df["Close"]).to_numpy()
df["Adj Open"] = df["Open"] * ratio
df["Adj High"] = df["High"] * ratio
df["Adj Low"] = df["Low"] * ratio
df.drop(
["Open", "High", "Low", "Close"],
@ -404,12 +515,9 @@ def parse_quotes(data):
def parse_actions(data):
dividends = _pd.DataFrame(
columns=["Dividends"], index=_pd.DatetimeIndex([]))
capital_gains = _pd.DataFrame(
columns=["Capital Gains"], index=_pd.DatetimeIndex([]))
splits = _pd.DataFrame(
columns=["Stock Splits"], index=_pd.DatetimeIndex([]))
dividends = None
capital_gains = None
splits = None
if "events" in data:
if "dividends" in data["events"]:
@ -438,6 +546,16 @@ def parse_actions(data):
splits["denominator"]
splits = splits[["Stock Splits"]]
if dividends is None:
dividends = _pd.DataFrame(
columns=["Dividends"], index=_pd.DatetimeIndex([]))
if capital_gains is None:
capital_gains = _pd.DataFrame(
columns=["Capital Gains"], index=_pd.DatetimeIndex([]))
if splits is None:
splits = _pd.DataFrame(
columns=["Stock Splits"], index=_pd.DatetimeIndex([]))
return dividends, splits, capital_gains
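For reference, the 'events' payload being parsed looks roughly like this (a hand-written sample, not a captured response):

data = {"events": {
    "dividends": {"1681128000": {"amount": 0.24, "date": 1681128000}},
    "splits": {"1598832000": {"numerator": 4, "denominator": 1,
                              "splitRatio": "4:1", "date": 1598832000}},
}}
# parse_actions(data) would then produce:
#   dividends     -> a "Dividends" column (0.24), indexed by event date
#   splits        -> "Stock Splits" = numerator / denominator = 4.0
#   capital_gains -> empty frame (only funds populate 'capitalGains')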
@ -448,31 +566,30 @@ def set_df_tz(df, interval, tz):
return df
def fix_Yahoo_returning_prepost_unrequested(quotes, interval, metadata):
def fix_Yahoo_returning_prepost_unrequested(quotes, interval, tradingPeriods):
# Sometimes Yahoo returns post-market data despite not requesting it.
# Normally happens on half-day early closes.
#
# And sometimes returns pre-market data despite not requesting it.
# E.g. some London tickers.
tps_df = metadata["tradingPeriods"]
tps_df = tradingPeriods.copy()
tps_df["_date"] = tps_df.index.date
quotes["_date"] = quotes.index.date
idx = quotes.index.copy()
quotes = quotes.merge(tps_df, how="left", validate="many_to_one")
quotes = quotes.merge(tps_df, how="left")
quotes.index = idx
# "end" = end of regular trading hours (including any auction)
f_drop = quotes.index >= quotes["end"]
f_drop = f_drop | (quotes.index < quotes["start"])
if f_drop.any():
# When printing report, ignore rows that were already NaNs:
f_na = quotes[["Open","Close"]].isna().all(axis=1)
n_nna = quotes.shape[0] - _np.sum(f_na)
n_drop_nna = _np.sum(f_drop & ~f_na)
quotes_dropped = quotes[f_drop]
# f_na = quotes[["Open","Close"]].isna().all(axis=1)
# n_nna = quotes.shape[0] - _np.sum(f_na)
# n_drop_nna = _np.sum(f_drop & ~f_na)
# quotes_dropped = quotes[f_drop]
# if debug and n_drop_nna > 0:
# print(f"Dropping {n_drop_nna}/{n_nna} intervals for falling outside regular trading hours")
quotes = quotes[~f_drop]
metadata["tradingPeriods"] = tps_df.drop(["_date"], axis=1)
quotes = quotes.drop(["_date", "start", "end"], axis=1)
return quotes
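The drop condition amounts to keeping only the half-open regular-hours window [start, end) per row; a toy version of the same filter, assuming the 'start'/'end' columns are already merged in:

import pandas as pd
idx = pd.to_datetime(["2023-07-17 09:30", "2023-07-17 15:59", "2023-07-17 16:05"])
quotes = pd.DataFrame({"Close": [1.0, 2.0, 3.0],
                       "start": pd.Timestamp("2023-07-17 09:30"),
                       "end": pd.Timestamp("2023-07-17 16:00")}, index=idx)
keep = (quotes.index >= quotes["start"]) & (quotes.index < quotes["end"])
quotes = quotes[keep]  # the 16:05 post-market row is dropped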
@ -511,16 +628,24 @@ def fix_Yahoo_returning_live_separate(quotes, interval, tz_exchange):
# Last two rows are within same interval
idx1 = quotes.index[n - 1]
idx2 = quotes.index[n - 2]
if idx1 == idx2:
# Yahoo returning last interval duplicated, which means
# Yahoo is not returning live data (phew!)
return quotes
if _np.isnan(quotes.loc[idx2, "Open"]):
quotes.loc[idx2, "Open"] = quotes["Open"][n - 1]
# Note: nanmax() & nanmin() ignore NaNs
quotes.loc[idx2, "High"] = _np.nanmax([quotes["High"][n - 1], quotes["High"][n - 2]])
quotes.loc[idx2, "Low"] = _np.nanmin([quotes["Low"][n - 1], quotes["Low"][n - 2]])
# Note: nanmax() & nanmin() ignore NaNs, but still need to check that not all values are NaN, to avoid warnings
if not _np.isnan(quotes["High"][n - 1]):
quotes.loc[idx2, "High"] = _np.nanmax([quotes["High"][n - 1], quotes["High"][n - 2]])
if "Adj High" in quotes.columns:
quotes.loc[idx2, "Adj High"] = _np.nanmax([quotes["Adj High"][n - 1], quotes["Adj High"][n - 2]])
if not _np.isnan(quotes["Low"][n - 1]):
quotes.loc[idx2, "Low"] = _np.nanmin([quotes["Low"][n - 1], quotes["Low"][n - 2]])
if "Adj Low" in quotes.columns:
quotes.loc[idx2, "Adj Low"] = _np.nanmin([quotes["Adj Low"][n - 1], quotes["Adj Low"][n - 2]])
quotes.loc[idx2, "Close"] = quotes["Close"][n - 1]
if "Adj High" in quotes.columns:
quotes.loc[idx2, "Adj High"] = _np.nanmax([quotes["Adj High"][n - 1], quotes["Adj High"][n - 2]])
if "Adj Low" in quotes.columns:
quotes.loc[idx2, "Adj Low"] = _np.nanmin([quotes["Adj Low"][n - 1], quotes["Adj Low"][n - 2]])
if "Adj Close" in quotes.columns:
quotes.loc[idx2, "Adj Close"] = quotes["Adj Close"][n - 1]
quotes.loc[idx2, "Volume"] += quotes["Volume"][n - 1]
@ -530,11 +655,6 @@ def fix_Yahoo_returning_live_separate(quotes, interval, tz_exchange):
def safe_merge_dfs(df_main, df_sub, interval):
# Carefully merge 'df_sub' onto 'df_main'
# If naive merge fails, try again with reindexing df_sub:
# 1) if interval is weekly or monthly, then try with index set to start of week/month
# 2) if still failing then manually search through df_main.index to reindex df_sub
if df_sub.shape[0] == 0:
raise Exception("No data to merge")
@ -544,6 +664,65 @@ def safe_merge_dfs(df_main, df_sub, interval):
raise Exception("Expected 1 data col")
data_col = data_cols[0]
df_main = df_main.sort_index()
intraday = interval.endswith('m') or interval.endswith('s')
td = _interval_to_timedelta(interval)
if intraday:
# On some exchanges an event can occur before market open, which is a
# problem when combining with intraday data.
# Solution: use dates, not datetimes, to map/merge.
df_main['_date'] = df_main.index.date
df_sub['_date'] = df_sub.index.date
indices = _np.searchsorted(_np.append(df_main['_date'], [df_main['_date'].iloc[-1]+td]), df_sub['_date'], side='left')
df_main = df_main.drop('_date', axis=1)
df_sub = df_sub.drop('_date', axis=1)
else:
indices = _np.searchsorted(_np.append(df_main.index, df_main.index[-1]+td), df_sub.index, side='right')
indices -= 1 # Convert from [[i-1], [i]) to [[i], [i+1])
# Numpy.searchsorted does not handle out-of-range well, so handle manually:
for i in range(len(df_sub.index)):
dt = df_sub.index[i]
if dt < df_main.index[0] or dt >= df_main.index[-1]+td:
# Out-of-range
indices[i] = -1
f_outOfRange = indices == -1
if f_outOfRange.any() and not intraday:
# If a dividend occurs in the next interval after the last price row,
# add a new row of NaNs
last_dt = df_main.index[-1]
next_interval_start_dt = last_dt + td
if interval == '1d':
# Allow for weekends & holidays
next_interval_end_dt = last_dt + _pd.Timedelta(days=7)
else:
next_interval_end_dt = next_interval_start_dt + td
for i in _np.where(f_outOfRange)[0]:
dt = df_sub.index[i]
if dt >= next_interval_start_dt and dt < next_interval_end_dt:
new_dt = dt if interval == '1d' else next_interval_start_dt
get_yf_logger().debug(f"Adding out-of-range {data_col} @ {dt.date()} in new prices row of NaNs")
df_main.loc[new_dt] = _np.nan
# Re-calculate indices
indices = _np.searchsorted(_np.append(df_main.index, df_main.index[-1]+td), df_sub.index, side='right')
indices -= 1 # Convert from [[i-1], [i]) to [[i], [i+1])
# Numpy.searchsorted does not handle out-of-range well, so handle manually:
for i in range(len(df_sub.index)):
dt = df_sub.index[i]
if dt < df_main.index[0] or dt >= df_main.index[-1]+td:
# Out-of-range
indices[i] = -1
f_outOfRange = indices == -1
if f_outOfRange.any():
if intraday or interval in ['1d', '1wk']:
raise Exception(f"The following '{data_col}' events are out-of-range, did not expect with interval {interval}: {df_sub.index}")
get_yf_logger().debug(f'Discarding these {data_col} events:' + '\n' + str(df_sub[f_outOfRange]))
df_sub = df_sub[~f_outOfRange].copy()
indices = indices[~f_outOfRange]
def _reindex_events(df, new_index, data_col_name):
if len(new_index) == len(set(new_index)):
# No duplicates, easy
@ -552,7 +731,7 @@ def safe_merge_dfs(df_main, df_sub, interval):
df["_NewIndex"] = new_index
# Duplicates present within periods but can aggregate
if data_col_name == "Dividends":
if data_col_name in ["Dividends", "Capital Gains"]:
# Add
df = df.groupby("_NewIndex").sum()
df.index.name = None
@ -565,106 +744,14 @@ def safe_merge_dfs(df_main, df_sub, interval):
if "_NewIndex" in df.columns:
df = df.drop("_NewIndex", axis=1)
return df
df = df_main.join(df_sub)
f_na = df[data_col].isna()
data_lost = sum(~f_na) < df_sub.shape[0]
if not data_lost:
return df
# Lost data during join()
# Backdate all df_sub.index dates to start of week/month
if interval == "1wk":
new_index = _pd.PeriodIndex(df_sub.index, freq='W').to_timestamp()
elif interval == "1mo":
new_index = _pd.PeriodIndex(df_sub.index, freq='M').to_timestamp()
elif interval == "3mo":
new_index = _pd.PeriodIndex(df_sub.index, freq='Q').to_timestamp()
else:
new_index = None
if new_index is not None:
new_index = new_index.tz_localize(df.index.tz, ambiguous=True, nonexistent='shift_forward')
df_sub = _reindex_events(df_sub, new_index, data_col)
df = df_main.join(df_sub)
f_na = df[data_col].isna()
data_lost = sum(~f_na) < df_sub.shape[0]
if not data_lost:
return df
# Lost data during join(). Manually check each df_sub.index date against df_main.index to
# find matching interval
df_sub = df_sub_backup.copy()
new_index = [-1] * df_sub.shape[0]
for i in range(df_sub.shape[0]):
dt_sub_i = df_sub.index[i]
if dt_sub_i in df_main.index:
new_index[i] = dt_sub_i
continue
# Found a bad index date, need to search for near-match in df_main (same week/month)
fixed = False
for j in range(df_main.shape[0] - 1):
dt_main_j0 = df_main.index[j]
dt_main_j1 = df_main.index[j + 1]
if (dt_main_j0 <= dt_sub_i) and (dt_sub_i < dt_main_j1):
fixed = True
if interval.endswith('h') or interval.endswith('m'):
# Must also be same day
fixed = (dt_main_j0.date() == dt_sub_i.date()) and (dt_sub_i.date() == dt_main_j1.date())
if fixed:
dt_sub_i = dt_main_j0
break
if not fixed:
last_main_dt = df_main.index[df_main.shape[0] - 1]
diff = dt_sub_i - last_main_dt
if interval == "1mo" and last_main_dt.month == dt_sub_i.month:
dt_sub_i = last_main_dt
fixed = True
elif interval == "3mo" and last_main_dt.year == dt_sub_i.year and last_main_dt.quarter == dt_sub_i.quarter:
dt_sub_i = last_main_dt
fixed = True
elif interval == "1wk":
if last_main_dt.week == dt_sub_i.week:
dt_sub_i = last_main_dt
fixed = True
elif (dt_sub_i >= last_main_dt) and (dt_sub_i - last_main_dt < _datetime.timedelta(weeks=1)):
# With some specific start dates (e.g. around early Jan), Yahoo
# messes up start-of-week (Saturday instead of Monday). So check
# if same week another way
dt_sub_i = last_main_dt
fixed = True
elif interval == "1d" and last_main_dt.day == dt_sub_i.day:
dt_sub_i = last_main_dt
fixed = True
elif interval == "1h" and last_main_dt.hour == dt_sub_i.hour:
dt_sub_i = last_main_dt
fixed = True
elif interval.endswith('m') or interval.endswith('h'):
td = _pd.to_timedelta(interval)
if (dt_sub_i >= last_main_dt) and (dt_sub_i - last_main_dt < td):
dt_sub_i = last_main_dt
fixed = True
new_index[i] = dt_sub_i
new_index = df_main.index[indices]
df_sub = _reindex_events(df_sub, new_index, data_col)
df = df_main.join(df_sub)
df = df_main.join(df_sub)
f_na = df[data_col].isna()
data_lost = sum(~f_na) < df_sub.shape[0]
if data_lost:
## Not always possible to match events with trading, e.g. when released pre-market.
## So have to append to bottom with nan prices.
## But should only be impossible with intra-day price data.
if interval.endswith('m') or interval.endswith('h') or interval == "1d":
# Update: is possible with daily data when dividend very recent
f_missing = ~df_sub.index.isin(df.index)
df_sub_missing = df_sub[f_missing].copy()
keys = {"Adj Open", "Open", "Adj High", "High", "Adj Low", "Low", "Adj Close",
"Close"}.intersection(df.columns)
df_sub_missing[list(keys)] = _np.nan
col_ordering = df.columns
df = _pd.concat([df, df_sub_missing], sort=True)[col_ordering]
else:
raise Exception("Lost data during merge despite all attempts to align data (see above)")
raise Exception('Data was lost in merge, investigate')
return df
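The searchsorted mapping that replaced the old manual scan can be demonstrated in isolation (hypothetical dates):

import numpy as np
import pandas as pd
prices = pd.to_datetime(["2023-07-03", "2023-07-10", "2023-07-17"])  # weekly price rows
events = pd.to_datetime(["2023-07-12"])                              # mid-week dividend
td = pd.Timedelta(days=7)
bins = prices.append(pd.DatetimeIndex([prices[-1] + td]))            # close the last interval
indices = np.searchsorted(bins, events, side="right") - 1
print(prices[indices])  # DatetimeIndex(['2023-07-10']) - the event lands in its week's row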
@ -690,7 +777,7 @@ def is_valid_timezone(tz: str) -> bool:
return True
def format_history_metadata(md):
def format_history_metadata(md, tradingPeriodsOnly=True):
if not isinstance(md, dict):
return md
if len(md) == 0:
@ -698,60 +785,54 @@ def format_history_metadata(md):
tz = md["exchangeTimezoneName"]
for k in ["firstTradeDate", "regularMarketTime"]:
if k in md:
md[k] = _pd.to_datetime(md[k], unit='s', utc=True).tz_convert(tz)
if not tradingPeriodsOnly:
for k in ["firstTradeDate", "regularMarketTime"]:
if k in md and md[k] is not None:
if isinstance(md[k], int):
md[k] = _pd.to_datetime(md[k], unit='s', utc=True).tz_convert(tz)
if "currentTradingPeriod" in md:
for m in ["regular", "pre", "post"]:
if m in md["currentTradingPeriod"]:
for t in ["start", "end"]:
md["currentTradingPeriod"][m][t] = \
_pd.to_datetime(md["currentTradingPeriod"][m][t], unit='s', utc=True).tz_convert(tz)
del md["currentTradingPeriod"][m]["gmtoffset"]
del md["currentTradingPeriod"][m]["timezone"]
if "tradingPeriods" in md:
if md["tradingPeriods"] == {"pre":[], "post":[]}:
del md["tradingPeriods"]
if "currentTradingPeriod" in md:
for m in ["regular", "pre", "post"]:
if m in md["currentTradingPeriod"] and isinstance(md["currentTradingPeriod"][m]["start"], int):
for t in ["start", "end"]:
md["currentTradingPeriod"][m][t] = \
_pd.to_datetime(md["currentTradingPeriod"][m][t], unit='s', utc=True).tz_convert(tz)
del md["currentTradingPeriod"][m]["gmtoffset"]
del md["currentTradingPeriod"][m]["timezone"]
if "tradingPeriods" in md:
tps = md["tradingPeriods"]
if isinstance(tps, list):
# Only regular times
regs_dict = [tps[i][0] for i in range(len(tps))]
pres_dict = None
posts_dict = None
elif isinstance(tps, dict):
# Includes pre- and post-market
pres_dict = [tps["pre"][i][0] for i in range(len(tps["pre"]))]
posts_dict = [tps["post"][i][0] for i in range(len(tps["post"]))]
regs_dict = [tps["regular"][i][0] for i in range(len(tps["regular"]))]
else:
raise Exception()
if tps == {"pre":[], "post":[]}:
# Ignore
pass
elif isinstance(tps, (list, dict)):
if isinstance(tps, list):
# Only regular times
df = _pd.DataFrame.from_records(_np.hstack(tps))
df = df.drop(["timezone", "gmtoffset"], axis=1)
df["start"] = _pd.to_datetime(df["start"], unit='s', utc=True).dt.tz_convert(tz)
df["end"] = _pd.to_datetime(df["end"], unit='s', utc=True).dt.tz_convert(tz)
elif isinstance(tps, dict):
# Includes pre- and post-market
pre_df = _pd.DataFrame.from_records(_np.hstack(tps["pre"]))
post_df = _pd.DataFrame.from_records(_np.hstack(tps["post"]))
regular_df = _pd.DataFrame.from_records(_np.hstack(tps["regular"]))
pre_df = pre_df.rename(columns={"start":"pre_start", "end":"pre_end"}).drop(["timezone", "gmtoffset"], axis=1)
post_df = post_df.rename(columns={"start":"post_start", "end":"post_end"}).drop(["timezone", "gmtoffset"], axis=1)
regular_df = regular_df.drop(["timezone", "gmtoffset"], axis=1)
cols = ["pre_start", "pre_end", "start", "end", "post_start", "post_end"]
df = regular_df.join(pre_df).join(post_df)
for c in cols:
df[c] = _pd.to_datetime(df[c], unit='s', utc=True).dt.tz_convert(tz)
df = df[cols]
def _dict_to_table(d):
df = _pd.DataFrame.from_dict(d).drop(["timezone", "gmtoffset"], axis=1)
df["end"] = _pd.to_datetime(df["end"], unit='s', utc=True).dt.tz_convert(tz)
df["start"] = _pd.to_datetime(df["start"], unit='s', utc=True).dt.tz_convert(tz)
df.index = _pd.to_datetime(df["start"].dt.date)
df.index = df.index.tz_localize(tz)
return df
df.index.name = "Date"
df = _dict_to_table(regs_dict)
df_cols = ["start", "end"]
if pres_dict is not None:
pre_df = _dict_to_table(pres_dict)
df = df.merge(pre_df.rename(columns={"start":"pre_start", "end":"pre_end"}), left_index=True, right_index=True)
df_cols = ["pre_start", "pre_end"]+df_cols
if posts_dict is not None:
post_df = _dict_to_table(posts_dict)
df = df.merge(post_df.rename(columns={"start":"post_start", "end":"post_end"}), left_index=True, right_index=True)
df_cols = df_cols+["post_start", "post_end"]
df = df[df_cols]
df.index.name = "Date"
md["tradingPeriods"] = df
md["tradingPeriods"] = df
return md
@ -836,14 +917,21 @@ class _KVStore:
def get(self, key: str) -> Union[str, None]:
"""Get value for key if it exists else returns None"""
item = self.conn.execute('select value from "kv" where key=?', (key,))
try:
item = self.conn.execute('select value from "kv" where key=?', (key,))
except _sqlite3.IntegrityError as e:
self.delete(key)
return None
if item:
return next(item, (None,))[0]
def set(self, key: str, value: str) -> None:
with self._cache_mutex:
self.conn.execute('replace into "kv" (key, value) values (?,?)', (key, value))
self.conn.commit()
if value is None:
self.delete(key)
else:
with self._cache_mutex:
self.conn.execute('replace into "kv" (key, value) values (?,?)', (key, value))
self.conn.commit()
def bulk_set(self, kvdata: Dict[str, str]):
records = tuple(i for i in kvdata.items())
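In spirit _KVStore behaves like a persistent dict; a rough usage sketch against the get/set contract above (the path is hypothetical):

store = _KVStore("/tmp/tkr-tz.db")     # hypothetical location
store.set("MSFT", "America/New_York")
store.get("MSFT")                      # 'America/New_York'
store.set("MSFT", None)                # setting None now deletes the key
store.get("MSFT")                      # None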
@ -867,7 +955,11 @@ class _TzCache:
def __init__(self):
self._setup_cache_folder()
# Must init db here, where it is thread-safe
self._tz_db = _KVStore(_os.path.join(self._db_dir, "tkr-tz.db"))
try:
self._tz_db = _KVStore(_os.path.join(self._db_dir, "tkr-tz.db"))
except _sqlite3.DatabaseError as err:
raise _TzCacheException("Error creating TzCache folder: '{}' reason: {}"
.format(self._db_dir, err))
self._migrate_cache_tkr_tz()
def _setup_cache_folder(self):
@ -909,11 +1001,23 @@ class _TzCache:
if not _os.path.isfile(old_cache_file_path):
return None
try:
df = _pd.read_csv(old_cache_file_path, index_col="Ticker")
df = _pd.read_csv(old_cache_file_path, index_col="Ticker", on_bad_lines="skip")
except _pd.errors.EmptyDataError:
_os.remove(old_cache_file_path)
except TypeError:
_os.remove(old_cache_file_path)
else:
self.tz_db.bulk_set(df.to_dict()['Tz'])
# Discard corrupt data:
df = df[~df["Tz"].isna().to_numpy()]
df = df[~(df["Tz"]=='').to_numpy()]
df = df[~df.index.isna()]
if not df.empty:
try:
self.tz_db.bulk_set(df.to_dict()['Tz'])
except Exception as e:
# Ignore
pass
_os.remove(old_cache_file_path)
@ -944,9 +1048,10 @@ def get_tz_cache():
try:
_tz_cache = _TzCache()
except _TzCacheException as err:
print("Failed to create TzCache, reason: {}".format(err))
print("TzCache will not be used.")
print("Tip: You can direct cache to use a different location with 'set_tz_cache_location(mylocation)'")
get_yf_logger().info("Failed to create TzCache, reason: %s. "
"TzCache will not be used. "
"Tip: You can direct cache to use a different location with 'set_tz_cache_location(mylocation)'",
err)
_tz_cache = _TzCacheDummy()
return _tz_cache

View File

@ -1 +1 @@
version = "0.2.10"
version = "0.2.24"