Highcharts is a world-leading provider of accessible charting tools for the web, used by 80 of the top 100 companies in the Fortune 500. Recently, Highcharts collaborated with the Digital Accessibility Team at Elsevier, a global publishing company, to improve the accessibility of line charts with large datasets.

Line charts are often used to visualize datasets with thousands of data points. This presents a challenge for non-visual access, as providing access to individual data points is not sufficient. A reader of a line chart with a large amount of data aims to extract information about trends, patterns, and outliers. Can we make this information more accessible by communicating it through text and sound? What is the most intuitive way to experience this data through sound? And to what extent can we automate the text description?

Human-authored text descriptions of charts are historically difficult to beat, but in many cases they are impractical, such as when data is loaded dynamically in real time. Automated text descriptions can also be designed to be more objective and less prone to bias. Will users be able to trust these descriptions? Will they still prefer descriptions created by a human?

For each of these research questions, we will present feedback from non-sighted users on our approaches. We will share findings about best practices, and show screen reader demos to help illustrate design considerations.